Next to the technological challenge of biocompatible interfacing, a central issue is understanding the signals that the central nervous system emits, and how to encode external sensory signals so that the brain can interpret them. Biologically plausible models of neural-signal processing are thus central to efforts ranging from cochlear implants (invasive hearing aids) to artificial retinas and artificial limbs.
The spiking neuron models developed by our Machine Learning group directly model the behaviour of real neurons while providing an efficient neural code. Ongoing work is, for instance, improving our understanding of how the auditory nerve encodes information, and promises to improve the efficacy of cochlear implants. External sensory data can also be encoded directly for brain interfaces using such spiking neural networks.
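To give a flavour of what spiking neuron models look like, the sketch below simulates a textbook leaky integrate-and-fire neuron, the simplest model that converts a continuous input signal into a spike train. This is an illustrative toy, not the group's actual model; all names and parameter values (`tau`, `v_thresh`, the 1 ms time step) are chosen for the example.

```python
import numpy as np

def lif_spike_train(input_current, dt=1e-3, tau=0.02, v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron (illustrative toy model).

    The membrane potential v follows dv/dt = (-v + I) / tau (Euler-integrated);
    when v crosses v_thresh the neuron emits a spike and v resets.
    Returns the time-step indices at which spikes occur.
    """
    v = 0.0
    spikes = []
    for i, current in enumerate(input_current):
        v += dt * (-v + current) / tau
        if v >= v_thresh:
            spikes.append(i)
            v = v_reset
    return spikes

# A constant supra-threshold input yields a regular spike train whose rate
# grows with input strength -- a simple rate code for the stimulus intensity.
stim = np.full(1000, 2.0)  # 1 s of constant input at 1 ms resolution
spikes = lif_spike_train(stim)
```

Stronger stimuli drive the membrane to threshold faster and thus produce more spikes, which is the basic sense in which such a neuron "encodes" an external signal.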
Contact person: Sander Bohte
Research group: Machine Learning (ML)
Research partner: LUMC