
Content is provided by Nickle. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Nickle or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process outlined here: https://id.player.fm/legal.

Possible to turn ChatGPT, Bard, or any LLM from an amnesiac goldfish into a memory mammoth!

1:00:23
 
Episode 372107452 · Series 3463837

What if ChatGPT, Bard, or any other LLM could remember the things you said last month? For example, your plan to buy a home near your kids' school, the story you told your son last week, or the gift ideas your wife wouldn't like. That would be amazing, right? However, with the current short memory limit, also known as the context window, these capabilities remain a dream.

But Dr. Burtsev's research has come to our rescue! Thanks to his breakthrough, your LLMs can now accurately remember 1 million tokens, equivalent to several books' worth of information. We are now much closer to the chat agent of our dreams.
Want to know more about what this means for the rest of us? Just tune in to this podcast, where Dr. Burtsev joins us to discuss his inspiration, how he made it possible, and many other insightful thoughts and ideas about interactive learning, human brain-inspired machine learning algorithms, AGI, the Turing test, and, of course, AI safety.
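The core idea of the paper discussed in the episode, the Recurrent Memory Transformer (RMT), is to split a long input into segments and carry a small set of memory tokens from one segment to the next, so the effective context grows far beyond a single window. The toy sketch below illustrates that segment-level recurrence only; it is not the authors' implementation, and the class name `RMTSketch` and all hyperparameters are illustrative.

```python
import torch
import torch.nn as nn

class RMTSketch(nn.Module):
    """Toy sketch of segment-level recurrence with memory tokens."""

    def __init__(self, d_model=64, n_memory=4, n_heads=4, n_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Trainable memory embeddings, shared across segments.
        self.memory = nn.Parameter(torch.randn(1, n_memory, d_model))
        self.n_memory = n_memory

    def forward(self, segments):
        # segments: list of (batch, seg_len, d_model) tensors.
        memory = self.memory.expand(segments[0].size(0), -1, -1)
        outputs = []
        for seg in segments:
            x = torch.cat([memory, seg], dim=1)   # prepend memory tokens
            y = self.encoder(x)
            memory = y[:, :self.n_memory]         # updated memory carries over
            outputs.append(y[:, self.n_memory:])  # per-segment outputs
        return outputs, memory
```

Because each forward pass only attends within one segment plus a handful of memory tokens, compute stays bounded per segment while information can, in principle, propagate across the whole sequence through the recurrent memory.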

Here is the paper: Scaling Transformer to 1M tokens and beyond with RMT
https://arxiv.org/abs/2304.11062
Find Dr. Burtsev's profile here:
https://lims.ac.uk/profile/?id=114
Here are various other resources mentioned during the show:
Mikhail's LinkedIn page and information about the IGLU contest (Interactive Grounded Language Understanding)
https://www.linkedin.com/posts/mikhai...
The Society of Mind, Marvin Minsky
https://isbndb.com/book/9780671657130
The Human Brain Project
https://www.humanbrainproject.eu/en/b...
Yann LeCun, JEPA: A Path Towards Autonomous Machine Intelligence
https://www.reddit.com/r/MachineLearn...
Mindstorms in Natural Language-Based Societies of Mind, Jürgen Schmidhuber
