
Content provided by Nicola Kelly and ARTICLE 19. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Nicola Kelly and ARTICLE 19 or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ro.player.fm/legal.

AI's English language problem

35:01
Manage episode 371926485 series 3494322

In this week's episode, we look at the language biases inherent in AI technologies with guests Aliya Bhatia and Gabriel Nicholas from the Center for Democracy and Technology.

Chris, Aliya and Gabriel discuss how large language models work, the underrepresentation of non-English-language nations in training data, and the effects of AI trained largely on English. They look at how this might affect communities in India and speakers of languages such as Catalan, where translations are viewed through an Anglo-centric lens, and at the mistakes that commonly result. And they explore how companies might redress the balance to ensure diverse, high-quality data that covers different cultural contexts.

Follow the show and don’t miss an episode, published fortnightly on Mondays.

Hosted by tech reporter Chris Stokel-Walker
Produced by Christopher Hooton and Nicola Kelly.

Find out more about ARTICLE 19's work and follow us on:
Twitter: https://twitter.com/article19org
Facebook: https://www.facebook.com/ARTICLE19org
LinkedIn: https://www.linkedin.com/company/article19/

