Content provided by AHLA Podcasts. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by AHLA Podcasts or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process outlined here: https://id.player.fm/legal.
AI in Health Care: Managing Algorithmic Bias and Fairness

37:17

Brad M. Thompson, Partner, Epstein Becker & Green PC, Chris Provan, Managing Director & Senior Principal Data Scientist, Mosaic Data Science, and Sam Tyner-Monroe, Ph.D., Managing Director of Responsible AI, DLA Piper LLP (US), discuss how to analyze and mitigate the risk of bias in artificial intelligence through the lens of data science. They cover HHS’ Section 1557 Final Rule as it pertains to algorithmic bias, examples of biased algorithms, the role of proxies, stratification of algorithms by risk, how to test for biased algorithms, how compliance programs can be adapted to meet the unique needs of algorithmic bias, the NIST Risk Management Framework, whether it’s possible to ever get rid of bias, and how explainability and transparency can mitigate bias. Brad, Chris, and Sam spoke about this topic at AHLA’s 2024 Complexities of AI in Health Care in Chicago, IL.

To learn more about AHLA and the educational resources available to the health law community, visit americanhealthlaw.org.

