Content provided by Oxford University. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Oxford University or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process outlined here: https://id.player.fm/legal.
Distribution-dependent generalization bounds for noisy, iterative learning algorithms
Episode 293114057 · Series 1610930
Karolina Dziugaite (Element AI) gives the OxCSML Seminar on 26th February 2021.

Abstract: Deep learning approaches dominate in many application areas, but our understanding of generalization (relating empirical performance to future expected performance) is lacking. In some applications, standard algorithms like stochastic gradient descent (SGD) reliably return solutions with low test error. In other applications, these same algorithms rapidly overfit. There is, as yet, no satisfying theory explaining what conditions are required for these common algorithms to work in practice. In this talk, I will discuss standard approaches to explaining generalization in deep learning using tools from statistical learning theory, and present some of the barriers these approaches face in explaining deep learning. I will then discuss my recent work (NeurIPS 2019, 2020) on information-theoretic approaches to understanding generalization of noisy, iterative learning algorithms, such as Stochastic Gradient Langevin Dynamics (SGLD), a noisy version of SGD.
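To make the distinction between the two algorithms in the abstract concrete, here is a minimal sketch (not from the talk; the toy linear-regression problem, learning rate, and batch size are illustrative assumptions) contrasting an SGD update with an SGLD update, which simply adds Gaussian noise scaled by sqrt(2·lr) to each step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: linear regression with loss L(w) = mean((Xw - y)^2) / 2.
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=200)

def grad(w, idx):
    """Minibatch gradient of the mean squared-error loss."""
    xb, yb = X[idx], y[idx]
    return xb.T @ (xb @ w - yb) / len(idx)

def sgd_step(w, idx, lr):
    return w - lr * grad(w, idx)

def sgld_step(w, idx, lr):
    # SGLD = the SGD step plus isotropic Gaussian noise of scale sqrt(2*lr).
    noise = np.sqrt(2 * lr) * rng.normal(size=w.shape)
    return w - lr * grad(w, idx) + noise

w_sgd = np.zeros(5)
w_sgld = np.zeros(5)
for _ in range(500):
    idx = rng.choice(200, size=20, replace=False)
    w_sgd = sgd_step(w_sgd, idx, lr=0.05)
    w_sgld = sgld_step(w_sgld, idx, lr=0.05)

print("SGD  distance to w_true:", np.linalg.norm(w_sgd - w_true))
print("SGLD distance to w_true:", np.linalg.norm(w_sgld - w_true))
```

The injected noise keeps the SGLD iterates wandering around the minimizer rather than converging to it, which is what makes the algorithm amenable to the information-theoretic analysis the talk describes: the noise bounds how much information about any one training example the iterates can retain.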