Title: Channel-wise Competitive Learning: Transparent Layer-wise Training for Deep Networks
Abstract:
Backpropagation remains the standard approach for training deep neural networks, but its end-to-end nature offers limited transparency into how intermediate representations are formed and coordinated across layers. It also faces computational inefficiency and scalability challenges in resource-limited environments. This seminar presents our work on local, layer-wise learning, with a focus on Channel-wise Competitive Learning (CwC), a framework in which layers are trained independently through class-wise channel competition rather than through a global error signal. The method provides both layer-wise and class-wise transparency, allowing each layer to be interpreted as a local classifier while preserving modular training. I will present the core idea behind CwC, its empirical behavior on standard image classification benchmarks, and how its transparent structure helps reveal important training dynamics in layer-wise systems. I will also briefly discuss ongoing work on improving stability and scalability, including recent progress on activation/gradient behavior and extensions toward larger-class settings.
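To make the idea of class-wise channel competition concrete, the following is a minimal, hypothetical sketch, not the actual CwC implementation: it assumes a layer whose output channels are partitioned evenly into one group per class, scores each class by the aggregate activation of its channel group, and trains the layer with a purely local cross-entropy loss on those group scores. All names (`group_logits`, `local_loss`, `predict`) are illustrative.

```python
# Illustrative sketch of class-wise channel competition (not the CwC code).
# Assumption: a layer's channels are split evenly into one group per class.
import numpy as np

def group_logits(features, num_classes):
    """Aggregate channel activations into one logit per class group.

    features: (batch, channels, h, w) activations of a single layer;
    channels must be divisible by num_classes.
    """
    b, c, h, w = features.shape
    assert c % num_classes == 0
    grouped = features.reshape(b, num_classes, c // num_classes, h, w)
    # Each class competes through the mean activation of its channel group.
    return grouped.mean(axis=(2, 3, 4))

def local_loss(features, labels, num_classes):
    """Cross-entropy on per-group logits: a local error signal, no backprop
    from later layers is required."""
    logits = group_logits(features, num_classes)
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def predict(features, num_classes):
    """The layer doubles as a classifier: the winning group is the label."""
    return group_logits(features, num_classes).argmax(axis=1)
```

Because the loss depends only on this layer's own activations and the labels, each layer can be trained and inspected in isolation, which is the source of the layer-wise and class-wise transparency described above.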
Short-Bio:
Andreas Papachristodoulou received his B.Eng. in Computer Systems Engineering (2018) and his M.Sc. in Robotics (2019), both from the University of Birmingham. In 2020, he joined the KIOS Research and Innovation Center of Excellence (KIOS CoE) as a Research Engineer, working on deep learning and computer vision for autonomous vehicles. He is currently a PhD candidate at the KIOS CoE and the University of Cyprus, and his research focuses on advancing deep learning systems, with particular emphasis on layer-wise and local competitive learning, modular neural networks, and continual and multi-task learning.