Everyone is welcome to attend the online QuSoft seminar, featuring a lecture by Ronald de Wolf: Improved Quantum Boosting.
Abstract:
Boosting is a general method to convert a weak learner (which generates hypotheses that are just slightly better than random) into a strong
learner (which generates hypotheses that are much better than random).
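To make the idea concrete, here is a minimal classical boosting sketch in the AdaBoost style: weak hypotheses (decision stumps) that are only slightly better than random on a weighted dataset are combined into a strong classifier. The dataset, stump thresholds, and round count are illustrative assumptions, not taken from the talk.

```python
import math

# Hypothetical toy 1D dataset: points with +/-1 labels.
X = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [+1, +1, +1, -1, -1, -1]

def stump(threshold, sign):
    """Weak hypothesis: predict `sign` left of threshold, `-sign` right."""
    return lambda x: sign if x < threshold else -sign

# Candidate weak hypotheses: all threshold/sign combinations.
candidates = [stump(t, s) for t in (0.5, 1.5, 2.5, 3.5, 4.5) for s in (+1, -1)]

def adaboost(X, y, rounds=5):
    n = len(X)
    D = [1.0 / n] * n          # uniform initial distribution over examples
    ensemble = []              # list of (weight, hypothesis) pairs
    for _ in range(rounds):
        # Choose the weak hypothesis with the smallest weighted error.
        h, err = min(
            ((h, sum(D[i] for i in range(n) if h(X[i]) != y[i]))
             for h in candidates),
            key=lambda pair: pair[1],
        )
        err = max(err, 1e-10)  # guard against a perfect stump (division by zero)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # Reweight: increase weight on misclassified examples, renormalize.
        D = [D[i] * math.exp(-alpha * y[i] * h(X[i])) for i in range(n)]
        Z = sum(D)
        D = [d / Z for d in D]
    # Strong hypothesis: weighted majority vote of the weak hypotheses.
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

strong = adaboost(X, y)
print([strong(x) for x in X])
```

The quantum speedups discussed in the talk target the expensive inner step of this loop (finding and evaluating a good weak hypothesis under the current distribution), not the outer boosting logic.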
Recently, Arunachalam and Maity gave the first quantum improvement for boosting, by combining Freund and Schapire's AdaBoost algorithm with a
quantum algorithm for approximate counting. Their booster is faster than classical boosting as a function of the VC-dimension of the weak
learner's hypothesis class, but worse as a function of the quality of the weak learner. In this talk we give a substantially faster and
simpler quantum boosting algorithm, based on Servedio's SmoothBoost algorithm.
Based on joint work with Adam Izdebski.