Description
Leader of the Machine Learning group: Peter Grünwald.
Our research group focuses on how computer programs can learn from and understand data, and then make useful predictions based on it. These algorithms integrate insights from various fields, including statistics, artificial intelligence and neuroscience.
Machine learning applications are increasingly part of every aspect of life, from speech recognition on cell phones to illness prediction in healthcare. One common problem is heavily polluted data, for which no single model can provide an adequate explanation. At CWI we address this issue with statistical machine learning that combines predictions from different models and experts in order to reach reliable conclusions.
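A standard way to combine predictions from several models or experts is exponentially weighted averaging, as in the Hedge algorithm from the prediction-with-expert-advice literature. The sketch below is illustrative only (the function name and parameters are our own), not the group's specific method:

```python
import math

def exp_weighted_average(predictions, losses, eta=0.5):
    """Combine expert predictions with exponential weights (Hedge-style).

    predictions: current prediction of each expert.
    losses: each expert's cumulative past loss.
    eta: learning rate; larger values trust low-loss experts more.
    """
    weights = [math.exp(-eta * loss) for loss in losses]
    total = sum(weights)
    weights = [w / total for w in weights]
    return sum(w * p for w, p in zip(weights, predictions))

# Three experts predict tomorrow's value; the second has been most accurate,
# so the combined prediction is pulled toward its forecast of 2.0.
print(exp_weighted_average([1.0, 2.0, 3.0], [3.0, 0.5, 2.0]))
```

The combined forecast lands near the historically best expert while still hedging across the others, which is what gives such aggregation methods their robustness guarantees.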
We also study how networks of neurons in the brain process information, and how modern deep learning methods can benefit from neuroscience. We develop novel neural networks, such as Deep Adaptive Spiking Neural Networks, as well as theoretical models of neural learning and information processing in biology. Applications of our work range from low-energy neural machine learning to neuroprosthetics and greater insight into how the brain works.
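The building block of spiking neural networks is a neuron that integrates input over time and emits discrete spikes. A minimal sketch of the textbook leaky integrate-and-fire model (the constants and function name here are illustrative, not taken from the group's models):

```python
def lif_neuron(input_current, v_rest=0.0, v_thresh=1.0, tau=10.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron.

    input_current: sequence of input values, one per time step.
    Returns the list of time steps at which the neuron spiked.
    """
    v = v_rest
    spikes = []
    for t, i in enumerate(input_current):
        # The membrane potential leaks toward rest and integrates the input.
        v += dt * (-(v - v_rest) / tau + i)
        if v >= v_thresh:       # threshold crossing emits a spike
            spikes.append(t)
            v = v_rest          # reset after spiking
    return spikes

# A constant drive of 0.3 makes the neuron fire periodically.
print(lif_neuron([0.3] * 20))   # → [3, 7, 11, 15, 19]
```

Because information is carried by sparse spike events rather than dense activations, such networks can in principle be run at much lower energy cost than conventional artificial neural networks.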
News
Early genetic code very resistant to mutation
Researchers at Centrum Wiskunde & Informatica (CWI) in Amsterdam show that the genetic code is remarkably resistant to DNA replication errors. This might explain the success of the common ancestor of all life, which 3.5 billion years ago developed the genetic code that resides in every organism.
CWI co-launches Dutch Machine Learning Platform website
Ten organizations, including the Centrum Wiskunde & Informatica (CWI) in Amsterdam, launched the Dutch Machine Learning Platform website on 8 July. The URL is: http://www.mlplatform.nl/.
Cum laude for thesis 'Combining Strategies Efficiently' from Wouter Koolen
Computer programs advise on stock market investments
CWI builds new supercomputer
CWI started the construction of a new supercomputer cluster in early October 2003. The cluster, consisting of 48 dual and quad AMD Opteron systems, is the first quad-Opteron cluster in the Benelux. The new supercomputer, funded by the Netherlands Organization for Scientific Research (NWO), is expected to be operational in two months.
Sander Bohte receives NWO grants
The Netherlands Organization for Scientific Research (NWO) has awarded a VENI grant to CWI researcher Sander Bohte. Bohte will use the grant, approved in March 2003, to further his research on spiking neural networks. These networks incorporate the latest insights into the functioning of biological neurons and are, in theory, much more powerful than traditional artificial neural networks. Bohte's work aims to use spiking neurons in large-scale networks that can learn to deal with symbolic structures, such as grammar in language or compact descriptions of objects in vision.
Current events
ML Seminar: Rémi Bardenet (CNRS & CRIStAL, Univ. Lille)
Start: 2019-10-31 11:00:00+01:00 End: 2019-10-31 12:00:00+01:00
Everyone is welcome to attend the ML seminar by Rémi Bardenet, titled 'DPPs everywhere: repulsive point processes for Monte Carlo integration, signal processing and machine learning'.
Abstract: Determinantal point processes (DPPs) are specific repulsive point processes, which were introduced in the 1970s by Macchi to model fermion beams in quantum optics. More recently, they have been studied as models and sampling tools by statisticians and machine learners. Important statistical quantities associated with DPPs have geometric and algebraic interpretations, which makes them a fun object to study and a powerful algorithmic building block.
After a quick introduction to determinantal point processes, I will discuss some of our recent statistical applications of DPPs. First, we used DPPs to sample nodes in numerical integration, resulting in Monte Carlo integration that converges quickly with respect to the number of integrand evaluations. Second, we turned DPPs into low-error variable selection procedures in linear regression. If time allows, I'll describe a third application in which we used DPP machinery to characterize the distribution of the zeros of time-frequency transforms of white noise, a recent challenge in signal processing.
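The repulsion that makes DPPs useful is easy to see in the finite case: for a DPP with marginal kernel K, the probability that a subset S of items all appear in the sample is the determinant of the submatrix K restricted to S. A small sketch (the kernel values are made up for illustration; the 2x2 case suffices to show pairwise repulsion):

```python
def inclusion_probability(K, S):
    """P(S is contained in the sample) = det(K_S) for a finite DPP
    with marginal kernel K.

    K is a list of rows; S a list of item indices (here |S| <= 2,
    enough to illustrate pairwise repulsion).
    """
    sub = [[K[i][j] for j in S] for i in S]
    if len(S) == 1:
        return sub[0][0]
    (a, b), (c, d) = sub
    return a * d - b * c  # 2x2 determinant

# Two similar items (large off-diagonal entry) repel each other:
K = [[0.5, 0.4],
     [0.4, 0.5]]
together = inclusion_probability(K, [0, 1])   # det = 0.25 - 0.16 = 0.09
independent = K[0][0] * K[1][1]               # 0.25 under independent sampling
print(together, independent)                  # joint probability is lower: repulsion
```

The more similar two items are under the kernel, the larger the off-diagonal term and the smaller the determinant, so a DPP sample spreads out over diverse items — the property exploited in the integration and variable-selection applications above.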
Members
Associated Members
Publications

Borovykh, A.I, Oosterlee, C.W, & Bohte, S.M. (2019). Generalization in fully-connected neural networks for time series forecasting. Journal of Computational Science, 36, 1–15. doi:10.1016/j.jocs.2019.07.007

Yin, B, Balvert, M, van der Spek, R.A.A, Dutilh, B.E, Bohte, S.M, Veldink, J, & Schönhuth, A. (2019). Using the structure of genome data in the design of deep neural networks for predicting amyotrophic lateral sclerosis from genotype. In Bioinformatics (Vol. 35, pp. i538–i547). doi:10.1093/bioinformatics/btz369

van Doorn, J, Ly, A, Marsman, M, & Wagenmakers, E.J. (2019). Bayesian estimation of Kendall's τ using a latent normal approach. Statistics & Probability Letters, 145, 268–272. doi:10.1016/j.spl.2018.10.004

Zambrano, D, Nusselder, R.B.P, Scholte, H.S, & Bohte, S.M. (2019). Sparse computation in adaptive spiking neural networks. Frontiers in Neuroscience, 12(JAN). doi:10.3389/fnins.2018.00987

Kaufmann, E, Koolen-Wijkstra, W.M, & Garivier, A. (2018). Sequential test for the lowest mean: From Thompson to Murphy sampling. In Advances in Neural Information Processing Systems (pp. 6332–6342).

Karamanis, M, Zambrano, D, & Bohte, S.M. (2018). Continuous-time spike-based reinforcement learning for working memory tasks. In Lecture Notes in Computer Science/Lecture Notes in Artificial Intelligence (pp. 250–262). doi:10.1007/978-3-030-01421-6_25

Dora, S, Pennartz, C, & Bohte, S.M. (2018). A deep predictive coding network for inferring hierarchical causes underlying sensory inputs. In Lecture Notes in Computer Science/Lecture Notes in Artificial Intelligence (pp. 457–467). doi:10.1007/978-3-030-01424-7_45

Pozzi, I, Nusselder, R.B.P, Zambrano, D, Bohte, S.M, & Iliadis, L. (2018). Gating sensory noise in a spiking subtractive LSTM. In V Kůrková, Y Manolopoulos, B Hammer, & I Maglogiannis (Eds.), Proceedings of Artificial Neural Networks and Machine Learning - ICANN 2018 (pp. 284–293). Springer, Cham. doi:10.1007/978-3-030-01418-6_28

Grünwald, P.D, & de Heide, R. (2018). Invited discussion of the paper 'Using Stacking to Average Bayesian Predictive Distributions' by Yao, Vehtari, Simpson and Gelman. Bayesian Analysis, 13(3), 917–1003.

Yin, B, Balvert, M, Zambrano, D, Schönhuth, A, & Bohte, S.M. (2018). An image representation based convolutional network for DNA classification. In 6th International Conference on Learning Representations.
Software
Squint: Experimenting in Prediction with Expert Advice problems
Squint provides a codebase for numerical proof-of-concept experiments in Prediction with Expert Advice, a core problem in learning theory.
Current projects with external funding

Efficient Deep Learning Platforms (eDLP)

Enabling Personalized Interventions (EPI)

Safe Bayesian Inference: A Theory of Misspecification based on Statistical Learning (SAFEBAYES)

Spiking Neural Networks research program
Related partners

Philips

KPMG

SURFsara B.V.

Technische Universiteit Eindhoven

Universiteit Twente

Universiteit van Amsterdam

Vrije Universiteit Amsterdam