Ep 12 - Education & advocacy for AI safety w/ Rob Miles (YouTube host)

1:21:26
 
Content provided by Soroush Pour. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Soroush Pour or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process outlined here: https://id.player.fm/legal.

We speak with Rob Miles. Rob is the host of the “Robert Miles AI Safety” channel on YouTube, the single most popular AI alignment video series out there: he has 145,000 subscribers and his top video has ~600,000 views. He goes much deeper on alignment than most educational resources, covering important technical topics like the orthogonality thesis, inner misalignment, and instrumental convergence.
Through his work, Robert has educated thousands on AI safety, including many now working on advocacy, policy, and technical research. His work has been invaluable for teaching and inspiring the next generation of AI safety experts and deepening public support for the cause.
Prior to his AIS education work, Robert studied Computer Science at the University of Nottingham.
We talk to Rob about:
* What got him into AI safety
* How he started making educational videos for AI safety
* What he's working on now
* His top advice for people who also want to do education & advocacy work, really in any field, but especially for AI safety
* How he thinks AI safety is currently going as a field of work
* What he wishes more people were working on within AI safety
Hosted by Soroush Pour. Follow me for more AGI content:
Twitter: https://twitter.com/soroushjp
LinkedIn: https://www.linkedin.com/in/soroushjp/
== Show links ==
-- About Rob --
* Rob Miles AI Safety channel - https://www.youtube.com/@RobertMilesAI
* Twitter - https://twitter.com/robertskmiles
-- Further resources --
* Channel where Rob first started making videos: https://www.youtube.com/@Computerphile
* Podcast ep w/ Eliezer Yudkowsky, whose writings first convinced Rob to take AI safety seriously: https://lexfridman.com/eliezer-yudkowsky/
Recording date: Nov 21, 2023
