
From Biological Neurons to Artificial Neural Networks: A Bioinspired Training Alternative

Conference paper
First Online:
Advances in Computational Intelligence (IWANN 2025)

Abstract

Artificial neural networks often rely on fixed architectures and uniform training strategies, overlooking adaptive mechanisms found in biological learning. This work presents a conceptual framework for a biologically inspired training algorithm that mimics human skill acquisition through a progressive, difficulty-driven curriculum. Grounded in principles such as synaptic plasticity, dendritic computation, and modular learning, the proposed method restructures training into stages of increasing complexity, expanding the model’s architecture over time.

Unlike other biologically inspired approaches that redesign neuron structures, this strategy retains standard neural components, allowing its application to conventional neural architectures while introducing an adaptive training schedule aligned with neurodevelopment. The model incrementally grows to match input difficulty, transferring and perturbing learned weights across stages to promote generalization and efficiency. Although not yet empirically validated, this work outlines a plan for systematic evaluation. Future phases will include the implementation of the training methods and generation of staged models. The framework offers a scalable and interpretable path toward energy-efficient learning and contributes to bridging biological and artificial intelligence.
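The staged growth the abstract describes, expanding the architecture between curriculum stages while transferring and lightly perturbing learned weights, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the helper `grow_weights`, the layer sizes, and the noise scale are all hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def grow_weights(W, new_in, new_out, noise=0.01):
    """Expand a trained weight matrix to a larger shape for the next stage.

    Learned weights are copied into the top-left block and lightly
    perturbed; the newly added rows and columns start near zero, so the
    grown model initially behaves much like the smaller one while gaining
    capacity for harder inputs.
    """
    W_new = rng.normal(0.0, noise, size=(new_in, new_out))
    old_in, old_out = W.shape
    W_new[:old_in, :old_out] = W + rng.normal(0.0, noise, size=W.shape)
    return W_new

# Stage 1: a small trained layer (stand-in for a converged early-stage model).
W_stage1 = rng.normal(size=(4, 8))

# Stage 2: grow the layer before training on the harder curriculum stage.
W_stage2 = grow_weights(W_stage1, new_in=6, new_out=16)

print(W_stage2.shape)  # (6, 16)
```

The perturbation plays the role the abstract assigns to it: the transferred weights are nudged off their stage-1 optimum so later training can escape it, while the near-zero new units avoid disrupting what was already learned.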



Author information


Corresponding author

Correspondence to Alberto Fernandez-Sanchez.


Copyright information

© 2026 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Fernandez-Sanchez, A., Gestal, M., Dorado, J., Pazos, A. (2026). From Biological Neurons to Artificial Neural Networks: A Bioinspired Training Alternative. In: Rojas, I., Joya, G., Catala, A. (eds) Advances in Computational Intelligence. IWANN 2025. Lecture Notes in Computer Science, vol 16008. Springer, Cham. https://doi.org/10.1007/978-3-032-02725-2_22


  • DOI: https://doi.org/10.1007/978-3-032-02725-2_22

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-032-02724-5

  • Online ISBN: 978-3-032-02725-2

  • eBook Packages: Computer Science, Computer Science (R0)
