The Terrifying AI Scam That Uses Your Loved One’s Voice
2024-03-31
This article describes a new AI scam that mimics the voices of victims’ loved ones to stage highly convincing, emotionally manipulative extortion attempts. It highlights the increasing sophistication of AI voice cloning, the difficulty of combating such scams, and the emotional and psychological toll on victims. One countermeasure worth considering is a “family safe word” that can be requested whenever a caller’s identity is in question. The next challenge is making sure everyone in the family remembers the safe word without having to write it down.