Security Nudge. Voice Cloning Scam Risks. Sponsored By CybSafe.

1:25
 
Content provided by Cybercrime Magazine. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Cybercrime Magazine or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ro.player.fm/legal.
If you got a phone call from someone who sounded exactly like your partner, child, or parent, and they begged you to send money because they were stuck somewhere – would you do it? Of course you would – and you might well become the latest victim of a voice cloning scam.

Companies have demonstrated new AI tools that can emulate someone’s voice after hearing just three seconds of that individual speaking. Once they have your voiceprint, cybercriminals can use it any way they want – and it’s happening so frequently that the FTC recently held a competition to find the best tools for detecting AI voice clones.

If you do get a phone call or voice message asking for money, even if it sounds like a loved one, don’t be fooled. Take a moment to contact that person on a known phone number; text them to ask them to call you; or ask someone else you trust to verify your loved one’s location and status. No matter what the voice on the phone says, you just might find them at home, safe and sound.

The 60-second "Security Nudge" is brought to you by CybSafe, developers of the Human Risk Management Platform. Learn more at https://cybsafe.com