Abstract
Artificial neural networks often rely on fixed architectures and uniform training strategies, overlooking adaptive mechanisms found in biological learning. This work presents a conceptual framework for a biologically inspired training algorithm that mimics human skill acquisition through a progressive, difficulty-driven curriculum. Grounded in principles such as synaptic plasticity, dendritic computation, and modular learning, the proposed method restructures training into stages of increasing complexity, expanding the model’s architecture over time.
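As an illustration only (the method is not yet implemented), the following Python sketch shows one way such a difficulty-driven, cumulative staging of a dataset could be assembled; build_curriculum, difficulty_fn, and n_stages are hypothetical names standing in for whatever hardness measure and stage count a concrete instantiation would adopt.

    import numpy as np

    def build_curriculum(inputs, labels, difficulty_fn, n_stages=3):
        # Score every example; higher score = harder example. The choice of
        # difficulty measure is left open here (a placeholder assumption).
        scores = np.array([difficulty_fn(x, y) for x, y in zip(inputs, labels)])
        order = np.argsort(scores)                 # indices sorted easy -> hard
        tiers = np.array_split(order, n_stages)    # roughly equal difficulty tiers
        stages, seen = [], []
        for tier in tiers:
            # Each stage trains on all tiers seen so far, so earlier, easier
            # material is rehearsed while harder examples are introduced.
            seen.extend(tier.tolist())
            idx = np.array(seen)
            stages.append((inputs[idx], labels[idx]))
        return stages

Training then proceeds stage by stage over the returned subsets, with the architecture expanded between stages as described below.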
Unlike other biologically inspired approaches that redesign neuron structures, this strategy retains standard neural components, so it can be applied to conventional neural architectures while introducing an adaptive training schedule aligned with neurodevelopment. The model grows incrementally to match input difficulty, transferring and perturbing learned weights across stages to promote generalization and efficiency. Although the approach has not yet been empirically validated, this work outlines a plan for its systematic evaluation; future phases will implement the training method and generate the staged models. The framework offers a scalable, interpretable path toward energy-efficient learning and helps bridge biological and artificial intelligence.
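Purely as a hedged illustration of the growth-with-weight-transfer idea described above (not a specification from this work), the PyTorch sketch below widens a trained linear layer while carrying over and lightly perturbing its learned weights; grow_hidden_layer, extra_units, and the noise_std value are illustrative assumptions.

    import torch
    import torch.nn as nn

    def grow_hidden_layer(old_layer: nn.Linear, extra_units: int,
                          noise_std: float = 0.01) -> nn.Linear:
        # Build a wider layer; the added units keep their fresh random init.
        new_layer = nn.Linear(old_layer.in_features,
                              old_layer.out_features + extra_units)
        with torch.no_grad():
            # Transfer the trained weights into the first rows of the new
            # layer, perturbing them with small Gaussian noise before the
            # next, harder training stage begins.
            new_layer.weight[:old_layer.out_features] = (
                old_layer.weight
                + noise_std * torch.randn_like(old_layer.weight))
            new_layer.bias[:old_layer.out_features] = old_layer.bias
        return new_layer

In a full network, any layer consuming this one's output would also need its input dimension widened; that bookkeeping is omitted here for brevity.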
Cite this paper
Fernandez-Sanchez, A., Gestal, M., Dorado, J., Pazos, A. (2026). From Biological Neurons to Artificial Neural Networks: A Bioinspired Training Alternative. In: Rojas, I., Joya, G., Catala, A. (eds) Advances in Computational Intelligence. IWANN 2025. Lecture Notes in Computer Science, vol 16008. Springer, Cham. https://doi.org/10.1007/978-3-032-02725-2_22