Description
Leader of the Machine Learning group: Peter Grünwald.
Our research group focuses on how computer programs can learn from and understand data, and then make useful predictions based on it. These algorithms integrate insights from various fields, including statistics, artificial intelligence and neuroscience.
Machine learning applications are increasingly part of every aspect of life, from speech recognition on cell phones to illness prediction in healthcare. One common problem is heavily polluted data, for which no single model can provide an adequate explanation. At CWI we address this issue with statistical machine learning based on combining predictions from different models and experts in order to reach reliable conclusions.
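A classic way to combine predictions from several models or experts is the exponential-weights (Hedge) rule, a standard building block in this area. The sketch below is a minimal textbook version for illustration only, not one of the group's actual algorithms:

```python
import numpy as np

def hedge(expert_losses, eta=0.5):
    """Aggregate experts with the exponential-weights (Hedge) rule.

    expert_losses: (T, K) array of losses in [0, 1] for K experts over T rounds.
    Returns the learner's per-round mixture losses and the final weights.
    """
    T, K = expert_losses.shape
    weights = np.full(K, 1.0 / K)            # start with a uniform mixture
    learner_losses = np.empty(T)
    for t in range(T):
        learner_losses[t] = weights @ expert_losses[t]       # loss of the mixture
        weights = weights * np.exp(-eta * expert_losses[t])  # downweight bad experts
        weights /= weights.sum()
    return learner_losses, weights

# Two noisy experts and one reliable one: the mixture concentrates on expert 2.
rng = np.random.default_rng(0)
losses = rng.uniform(0.0, 1.0, size=(200, 3))
losses[:, 2] *= 0.1                          # expert 2 is consistently better
learner_losses, w = hedge(losses)
```

After enough rounds the weight vector puts nearly all its mass on the best expert, so the mixture's loss tracks that expert's loss up to a small regret term.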
We also study how networks of neurons in the brain process information, and how modern deep learning methods can benefit from neuroscience. We develop novel neural networks, like Deep Adaptive Spiking Neural Networks, as well as theoretical models of neural learning and information processing in biology. Applications of our work range from low-energy neural machine learning to neuroprosthetics, to increased insight into how the brain works.
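As background to this line of work: spiking neurons communicate through discrete spikes rather than continuous activations. The following is a minimal textbook leaky integrate-and-fire neuron for illustration, not the group's Adaptive Spiking Neural Networks, whose adaptive thresholds are considerably more involved:

```python
import numpy as np

def lif_spikes(current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron driven by an input current.

    Returns the membrane-potential trace and the binary spike train.
    """
    v = 0.0
    vs, spikes = [], []
    for i in current:
        v += dt / tau * (-v + i)   # leaky integration of the input current
        fired = v >= v_thresh
        if fired:
            v = v_reset            # reset the membrane potential after a spike
        vs.append(v)
        spikes.append(int(fired))
    return np.array(vs), np.array(spikes)

# A constant supra-threshold input makes the neuron spike periodically.
v_trace, train = lif_spikes(np.full(200, 1.5))
```

Because the neuron only emits a spike when its membrane potential crosses the threshold, information is carried by sparse discrete events, which is what makes spiking networks attractive for low-energy hardware.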
News
CWI researchers selected as ACM Future of Computing Academy members
CWI researchers Tim Baarslag and Wouter Koolen have been selected as members of the ACM Future of Computing Academy (FCA).
NWO TOP grant for Peter Grünwald
The Netherlands Organisation for Scientific Research (NWO) has awarded a Physical Sciences TOP grant (module 1) for curiosity-driven research to Peter Grünwald of CWI.
ERCIM News 107 on Machine Learning co-coordinated by Sander Bohte, with extra Open Access section
In October 2016, ERCIM News No. 107 was published: http://ercim-news.ercim.eu/en107. It features a special theme on current trends and new paradigms in Machine Learning, coordinated by guest editors Sander Bohte (CWI) and Hung Son Nguyen (University of Warsaw).
Bayesian statistics not as robust as commonly thought
The widely used method of Bayesian statistics is not as robust as commonly thought. Researcher Thijs van Ommen of Centrum Wiskunde & Informatica (CWI) discovered that for certain types of problems, Bayesian statistics finds non-existent patterns in the data. Van Ommen defends his thesis on this topic on Wednesday 10 June at Leiden University.
Current events
ML Seminar: Rémi Bardenet (CNRS & CRIStAL, Univ. Lille)
Start: 2019-10-31 11:00:00+01:00 End: 2019-10-31 12:00:00+01:00
Everyone is welcome to attend the ML seminar of Rémi Bardenet with the title 'DPPs everywhere: repulsive point processes for Monte Carlo integration, signal processing and machine learning'.
Abstract: Determinantal point processes (DPPs) are specific repulsive point processes, which were introduced in the 1970s by Macchi to model fermion beams in quantum optics. More recently, they have been studied as models and sampling tools by statisticians and machine learners. Important statistical quantities associated to DPPs have geometric and algebraic interpretations, which makes them a fun object to study and a powerful algorithmic building block.
After a quick introduction to determinantal point processes, I will discuss some of our recent statistical applications of DPPs. First, we used DPPs to sample nodes in numerical integration, resulting in Monte Carlo integration with fast convergence with respect to the number of integrand evaluations. Second, we turned DPPs into low-error variable-selection procedures in linear regression. If time allows, I'll describe a third application where we used DPP machinery to characterize the distribution of the zeros of time-frequency transforms of white noise, a recent challenge in signal processing.
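For readers unfamiliar with DPPs: under the standard L-ensemble parametrization, the probability of selecting exactly the subset S is det(L_S) / det(L + I), which makes similar items unlikely to be selected together. A minimal numerical illustration of this repulsion (not code from the talk):

```python
import numpy as np

def dpp_prob(L, subset):
    """P(X = subset) under the L-ensemble DPP with kernel matrix L."""
    S = np.ix_(subset, subset)
    return np.linalg.det(L[S]) / np.linalg.det(L + np.eye(len(L)))

# Kernel over 3 items where items 0 and 1 are nearly identical (similar),
# while item 2 is dissimilar to both.
L = np.array([[1.0, 0.9, 0.0],
              [0.9, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

p_similar = dpp_prob(L, [0, 1])   # co-selecting two similar items
p_diverse = dpp_prob(L, [0, 2])   # co-selecting two dissimilar items
```

The determinant shrinks as the rows of L_S become more linearly dependent, so the diverse pair {0, 2} is far more probable than the near-duplicate pair {0, 1}.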
Members
Associated Members
Publications

Borovykh, A.I, Oosterlee, C.W, & Bohte, S.M. (2019). Generalization in fully-connected neural networks for time series forecasting. Journal of Computational Science, 36, 1–15. doi:10.1016/j.jocs.2019.07.007

Yin, B, Balvert, M, van der Spek, R.A.A, Dutilh, B.E, Bohte, S.M, Veldink, J, & Schönhuth, A. (2019). Using the structure of genome data in the design of deep neural networks for predicting amyotrophic lateral sclerosis from genotype. In Bioinformatics (Vol. 35, pp. i538–i547). doi:10.1093/bioinformatics/btz369

van Doorn, J, Ly, A, Marsman, M, & Wagenmakers, E.J. (2019). Bayesian estimation of Kendall's τ using a latent normal approach. Statistics & Probability Letters, 145, 268–272. doi:10.1016/j.spl.2018.10.004

Zambrano, D, Nusselder, R.B.P, Scholte, H.S, & Bohte, S.M. (2019). Sparse computation in adaptive spiking neural networks. Frontiers in Neuroscience, 12(JAN). doi:10.3389/fnins.2018.00987

Kaufmann, E, Koolen-Wijkstra, W.M, & Garivier, A. (2018). Sequential test for the lowest mean: From Thompson to Murphy sampling. In Advances in Neural Information Processing Systems (pp. 6332–6342).

Karamanis, M, Zambrano, D, & Bohte, S.M. (2018). Continuous-time spike-based reinforcement learning for working memory tasks. In Lecture Notes in Computer Science/Lecture Notes in Artificial Intelligence (pp. 250–262). doi:10.1007/978-3-030-01421-6_25

Dora, S, Pennartz, C, & Bohte, S.M. (2018). A deep predictive coding network for inferring hierarchical causes underlying sensory inputs. In Lecture Notes in Computer Science/Lecture Notes in Artificial Intelligence (pp. 457–467). doi:10.1007/978-3-030-01424-7_45

Pozzi, I, Nusselder, R.B.P, Zambrano, D, Bohte, S.M, & Iliadis, L. (2018). Gating sensory noise in a spiking subtractive LSTM. In V Kůrková, Y Manolopoulos, B Hammer, & I Maglogiannis (Eds.), Proceedings of Artificial Neural Networks and Machine Learning – ICANN 2018 (pp. 284–293). Springer, Cham. doi:10.1007/978-3-030-01418-6_28

Grünwald, P.D, & de Heide, R. (2018). Invited discussion of the paper "Using Stacking to Average Bayesian Predictive Distributions" by Yao, Vehtari, Simpson and Gelman. Bayesian Analysis, 13(3), 917–1003.

Yin, B, Balvert, M, Zambrano, D, Schönhuth, A, & Bohte, S.M. (2018). An image representation based convolutional network for DNA classification. In 6th International Conference on Learning Representations.
Software
Squint: Experimenting in Prediction with Expert Advice problems
Squint provides a codebase for numerical proof-of-concept experiments in Prediction with Expert Advice problems, a core setting in learning theory.
Current projects with external funding

Efficient Deep Learning Platforms (eDLP)

Enabling Personalized Interventions (EPI)

Safe Bayesian Inference: A Theory of Misspecification based on Statistical Learning (SAFEBAYES)

Spiking Neural Networks research program
Related partners

Philips

KPMG

SURFsara B.V.

Technische Universiteit Eindhoven

Universiteit Twente

Universiteit van Amsterdam

Vrije Universiteit Amsterdam