
Content provided by Super Data Science: ML & AI Podcast with Jon Krohn and Jon Krohn. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Super Data Science: ML & AI Podcast with Jon Krohn and Jon Krohn or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ro.player.fm/legal.

759: Full Encoder-Decoder Transformers Fully Explained, with Kirill Eremenko

1:43:13
 
Encoders, cross-attention, and masking for LLMs: SuperDataScience Founder Kirill Eremenko returns to the SuperDataScience podcast, where he speaks with Jon Krohn about transformer architectures and why they are a new frontier for generative AI. If you’re interested in applying LLMs to your business portfolio, you’ll want to pay close attention to this episode!

This episode is brought to you by Ready Tensor, where innovation meets reproducibility (https://www.readytensor.ai/), by Oracle NetSuite business software (netsuite.com/superdata), and by Intel and HPE Ezmeral Software Solutions (http://hpe.com/ezmeral/chatbots). Interested in sponsoring a SuperDataScience Podcast episode? Visit https://passionfroot.me/superdatascience for sponsorship information.

In this episode you will learn:
• How decoder-only transformers work [15:51]
• How cross-attention works in transformers [41:05]
• How encoders and decoders work together (an example) [52:46]
• How encoder-only architectures excel at understanding natural language [1:20:34]
• The importance of masking during self-attention [1:27:08]

Additional materials: www.superdatascience.com/759
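As a companion to the topics listed above, here is a minimal sketch (not taken from the episode) of how causal masking in decoder self-attention and cross-attention over an encoder's output can be expressed; all names, shapes, and the NumPy implementation are illustrative assumptions, not the episode's material.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V, mask=None):
    """Scaled dot-product attention.
    Q: (n_q, d), K: (n_k, d), V: (n_k, d_v); mask: (n_q, n_k) booleans,
    True where a query position is allowed to attend to a key position."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # blocked positions get ~zero weight after softmax
    return softmax(scores, axis=-1) @ V

# Decoder self-attention: Q, K, V all come from the same (target) sequence,
# and a lower-triangular causal mask stops each position from seeing future tokens.
n, d = 5, 8
x = np.random.randn(n, d)                      # hypothetical decoder token representations
causal_mask = np.tril(np.ones((n, n), dtype=bool))
self_attended = attention(x, x, x, mask=causal_mask)

# Cross-attention in an encoder-decoder model: queries come from the decoder,
# while keys and values come from the encoder output; no causal mask is needed.
m = 7
enc_out = np.random.randn(m, d)                # hypothetical encoder output for a source sequence
cross_attended = attention(self_attended, enc_out, enc_out)
print(self_attended.shape, cross_attended.shape)  # (5, 8) (5, 8)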

782 episodes

