People are getting conned by scammers who use AI voices of family members in distress

  • Scammers are using artificial intelligence to impersonate the voices of family members in distress. Victims are falling for it and losing thousands of dollars.
  • Advances in artificial intelligence now allow bad actors to replicate a voice from an audio sample of just a few sentences.
  • A slew of cheap online tools can translate an audio file into a replica of a voice, allowing a swindler to make it “speak” whatever they type.