Ep 13 - AI researchers expect AGI sooner w/ Katja Grace (Co-founder & Lead Researcher, AI Impacts)

1:20:28

We speak with Katja Grace, co-founder and lead researcher at AI Impacts, a research group trying to answer key questions about the future of AI: when certain capabilities will arise, what AI will look like, and how it will all go for humanity.
We talk to Katja about:
* How AI Impacts' latest rigorous survey of leading AI researchers shows they have dramatically shortened their timelines for when AI will successfully tackle all human tasks & occupations
* The survey's methodology and why we can be confident in its results
* Responses to the survey
* Katja's journey into the field of AI forecasting
* Katja's thoughts about the future of AI, given her long tenure studying possible AI futures and their impacts
Hosted by Soroush Pour. Follow me for more AGI content:
Twitter: https://twitter.com/soroushjp
LinkedIn: https://www.linkedin.com/in/soroushjp/
== Show links ==
-- Follow Katja --
* Website: https://katjagrace.com/
* Twitter: https://x.com/katjagrace
-- Further resources --
* The 2023 survey of AI researchers' views: https://wiki.aiimpacts.org/ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/2023_expert_survey_on_progress_in_ai
* AI Impacts: https://aiimpacts.org/
* AI Impacts' Substack: https://blog.aiimpacts.org/
* Joe Carlsmith on power-seeking AI: https://arxiv.org/abs/2206.13353
* Abbreviated version: https://joecarlsmith.com/2023/03/22/existential-risk-from-power-seeking-ai-shorter-version
* The Vulnerable World Hypothesis by Nick Bostrom: https://nickbostrom.com/papers/vulnerable.pdf
Recorded Feb 22, 2024
