Statistical Learning Seminars

The effects of COVID-19 pervade research communities across the globe: conferences have been canceled, research visits postponed, and projects suspended. Like many others, we have sought new opportunities for collaboration in spite of the current state of affairs and have therefore organized this online seminar series in statistical learning.

Format

We use Zoom for all sessions. Upon joining the seminar, you will be placed in a waiting room; please wait for the host to let you into the meeting.

The seminars last approximately an hour, with 20 to 40 minutes allocated to the presentation and the rest reserved for discussion. Sessions are held regularly on Fridays at 15:30 CET. See Previous Talks for recordings, slides, and other resources from past seminars.

https://lu-se.zoom.us/j/65067339175

Mailing List

To receive announcements for upcoming seminars, please join the group at https://groups.google.com/g/statlearnsem.

Calendar Event

Link to calendar event

Upcoming Talks

April 23, 15:30 CET

Pragya Sur (Harvard University)

Title
A precise high-dimensional asymptotic theory for AdaBoost
Abstract
This talk will introduce a precise high-dimensional asymptotic theory for AdaBoost on separable data, taking both statistical and computational perspectives. We will consider the common modern setting where the number of features p and the sample size n are both large and comparable, and in particular, look at scenarios where the data is separable in an asymptotic sense. Under a class of statistical models, we will provide an (asymptotically) exact analysis of the generalization error of AdaBoost when the algorithm interpolates the training data and maximizes an empirical L1 margin. On the computational front, we provide a sharp analysis of the stopping time when boosting approximately maximizes the empirical L1 margin. Our theory provides several insights into properties of boosting; for instance, the larger the dimensionality ratio p/n, the faster the optimization reaches interpolation. At the heart of our theory lies an in-depth study of the maximum L1 margin, which can be accurately described by a new system of non-linear equations; we analyze this margin and the properties of this system using Gaussian comparison techniques and a novel uniform deviation argument. Time permitting, I will present a new class of boosting algorithms that correspond to Lq geometry, for q > 1, together with results on their high-dimensional generalization and optimization behavior. This is based on joint work with Tengyuan Liang.
Related Papers
A Precise High-Dimensional Asymptotic Theory for Boosting and Minimum-L1-Norm Interpolated Classifiers
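As a concrete reference point for the objects in the abstract, the minimal Python sketch below (not the speaker's code) runs AdaBoost with decision stumps from scikit-learn and computes the normalized empirical L1 margin, min_i y_i f(x_i) / ||alpha||_1, that the analysis centers on. The stump weak learner, the number of rounds, and the toy p > n data are illustrative assumptions, not the exact setup studied in the talk.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def adaboost(X, y, n_rounds=200):
        """Minimal AdaBoost sketch; labels y must lie in {-1, +1}."""
        n = len(y)
        w = np.full(n, 1.0 / n)                      # example weights
        stumps, alphas = [], []
        for _ in range(n_rounds):
            stump = DecisionTreeClassifier(max_depth=1)  # assumed weak learner
            stump.fit(X, y, sample_weight=w)
            pred = stump.predict(X)
            err = w @ (pred != y)                    # weighted 0-1 error
            if err >= 0.5:                           # no better than chance; stop
                break
            alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
            w *= np.exp(-alpha * y * pred)           # upweight misclassified examples
            w /= w.sum()
            stumps.append(stump)
            alphas.append(alpha)
        return stumps, np.array(alphas)

    def l1_margin(stumps, alphas, X, y):
        """Normalized empirical L1 margin: min_i y_i f(x_i) / ||alpha||_1."""
        f = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
        return np.min(y * f) / np.abs(alphas).sum()

    # Toy usage in a p > n regime, similar in spirit to the talk's setting.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 200))                  # n = 100 samples, p = 200 features
    y = np.sign(X[:, 0] + 0.1 * rng.normal(size=100))
    stumps, alphas = adaboost(X, y)
    print("L1 margin:", l1_margin(stumps, alphas, X, y))

In the separable regime discussed in the abstract, one expects this margin to become positive once boosting interpolates the training data.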

May 28, 15:30 CET

Hanwen Huang (University of Georgia)

Title
TBA
Abstract
TBA
Related Papers
TBA

Organization

This seminar series is a joint effort organized by the Department of Mathematics at Wrocław University, the Department of Mathematics at the University of Burgundy, and the Department of Statistics at Lund University.

Lund University
University of Burgundy
Wrocław University