Showing 1–3 of 3 results
Advanced filters: Author: Benjamin Scellier
  • Artificial neural networks can be trained using backpropagation because of the sophisticated digital hardware they run on. Here, the authors show how many simple physical systems can be trained autonomously to perform complex computations without needing to interface with any digital hardware.

    • Martin J. Falk
    • Adam T. Strupp
    • Arvind Murugan
    Research · Open Access
    Nature Communications
    Volume: 16, P: 1-13
  • This Review explores methods for training physical neural networks, both backpropagation-based and backpropagation-free, that could scale artificial intelligence models far beyond today's small-scale laboratory demonstrations while potentially improving computational efficiency.

    • Ali Momeni
    • Babak Rahmani
    • Romain Fleury
    Reviews
    Nature
    Volume: 645, P: 53-61
  • A deep network is best understood in terms of the components used to design it (objective functions, architecture and learning rules) rather than unit-by-unit computation. Richards et al. argue that this perspective inspires fruitful approaches to systems neuroscience.

    • Blake A. Richards
    • Timothy P. Lillicrap
    • Konrad P. Kording
    Reviews
    Nature Neuroscience
    Volume: 22, P: 1761-1770