Content provided by MIT OpenCourseWare. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by MIT OpenCourseWare or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process outlined here: https://id.player.fm/legal.

The Human Element in Machine Learning with Prof. Catherine D’Ignazio, Prof. Jacob Andreas & Harini Suresh

16:03
 

When computer science was in its infancy, programmers quickly realized that although computers are astonishingly powerful tools, the results they achieve are only as good as the data fed into them. (This principle was soon formalized as GIGO: “Garbage In, Garbage Out.”) What was true in the era of the UNIVAC has proved just as true in the era of machine learning: among other well-publicized AI fiascos, chatbots that have interacted with bigots have learned to spew racist invective, while facial-recognition software trained solely on images of white people sometimes fails to recognize people of color as human. In this episode, we meet Prof. Catherine D’Ignazio of MIT’s Department of Urban Studies and Planning (DUSP) and Prof. Jacob Andreas and Harini Suresh of the Department of Electrical Engineering and Computer Science. In 2021, D’Ignazio, Andreas, and Suresh collaborated on a project within the Schwarzman College of Computing’s Social and Ethical Responsibilities of Computing initiative, teaching computer science students in 6.864 Natural Language Processing to recognize how deep learning systems can replicate and magnify the biases inherent in the data sets used to train them.

Relevant Resources:

MIT OpenCourseWare

The OCW Educator Portal

Share your teaching insights

Social and Ethical Responsibilities of Computing (SERC) resource on OpenCourseWare

Case Studies in Social and Ethical Responsibilities of Computing

SERC website

Professor D’Ignazio’s faculty page

Professor Andreas’s faculty page

Harini Suresh’s personal website

Desmond Patton’s paper on analysis of communications on Twitter

Music in this episode by Blue Dot Sessions

Connect with Us

If you have a suggestion for a new episode, or if you have used OCW to change your life or the lives of others, tell us your story. We’d love to hear from you!

Call us at 617-715-2517

On our site

On Facebook

On X

On Instagram

On LinkedIn

Stay Current

Subscribe to the free monthly "MIT OpenCourseWare Update" e-newsletter.

Support OCW

If you like Chalk Radio and OpenCourseWare, donate to help keep these programs going!

Credits

Sarah Hansen, host and producer

Brett Paci, producer

Dave Lishansky, producer

Script writing assistance by Aubrey Calaway

Show notes by Peter Chipman


