PyTorch Developer Podcast

Content provided by PyTorch, Edward Yang, and Team PyTorch.

Batching

13:37

PyTorch operates on its input data in a batched manner, typically processing a whole batch of inputs at once (rather than one at a time, as would be the case in typical programming). In this podcast, we talk a little about the implications of batching operations in this way, about how PyTorch's API is structured for batching (hint: poorly), and about how NumPy introduced the concept of ufuncs/gufuncs to standardize broadcasting and batching behavior. There is some overlap between this podcast and previous episodes about TensorIterator and vmap; you may also be interested in those episodes.
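
To make the idea concrete, here is a minimal sketch (my own illustration, not code from the episode) contrasting a per-sample computation with its batched and broadcast forms; the torch.vmap call at the end assumes a PyTorch version recent enough to expose vmap directly under torch.

    # A sketch of "batched" execution: the same operator code handles one
    # sample or a whole batch, and broadcasting lines up mismatched shapes.
    import torch

    w = torch.randn(4, 3)

    # One sample: a single 3-vector through a 4x3 linear map.
    x = torch.randn(3)
    y = x @ w.T                       # shape (4,)

    # Batched: 32 samples at once; the leading dimension is the batch.
    xb = torch.randn(32, 3)
    yb = xb @ w.T                     # shape (32, 4)

    # Broadcasting: (32, 1) and (1, 4) combine into (32, 4), following the
    # same rules NumPy standardized for its ufuncs.
    scale = torch.randn(32, 1)
    bias = torch.randn(1, 4)
    out = yb * scale + bias           # shape (32, 4)

    # vmap makes batching explicit: write the per-sample function and let
    # vmap map it over the batch dimension for you.
    def per_sample(v):
        return v @ w.T

    yb2 = torch.vmap(per_sample)(xb)  # shape (32, 4)
    assert torch.allclose(yb, yb2)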

Further reading.
