People are getting conned by scammers who use AI voices of family members in distress

  • Scammers are using artificial intelligence to mimic the voices of family members in distress, and people are falling for it, losing thousands of dollars.
  • Advancements in artificial intelligence now allow bad actors to replicate a voice from an audio sample of just a few sentences.
  • A slew of cheap online tools can turn an audio file into a replica of a voice, letting a swindler make it “speak” whatever they type.