The phone rings. It's your daughter, the same voice: excited, almost tearful. "Dad, I've been in an accident. I need money right now to get out of trouble." Panic rises, your heart pounds. But something doesn't quite add up. That pause, that slightly metallic tone. You take a breath, and manage to find a moment to say, "What's our password?" Silence. Then, the call goes dead. It wasn't your daughter.
It was a deepfake: a voice generated by an artificial intelligence that cloned your daughter's voice from a few seconds of audio stolen from social media. Welcome to the era of intelligent voice scams, where a simple recording is all it takes to become anyone. And where a simple password can be the only thing standing between you and paying criminals thousands of euros.
Cloning a voice costs $5 per month
Voice cloning was once the stuff of spy movies. Now it's here, accessible to anyone with a monthly subscription to platforms like ElevenLabs or Play.ht. A study published in July 2025 in Scientific Reports shows that 70% of people cannot distinguish a cloned voice from the original. Artificial intelligence systems analyze frequencies, pauses, and intonations and replicate them with near-perfect accuracy. All it takes is five seconds of audio: a voice message on WhatsApp, a video on Instagram, a recorded Zoom call. And that's it.
In Italy, according to the Postal Police, deepfake voice scams increased by 456% between 2024 and 2025. Data from Resemble.AI tell of an impressive escalation: 22 incidents recorded between 2017 and 2022, 42 in 2023, 150 in 2024. In the first quarter of 2025 alone, there were 179 episodes, exceeding the entire previous year's total by 19%.
The elderly are prime targets. The "nephew in need" scam has caused millions of dollars in damage. But even corporate executives are caught in the net: in Hong Kong, an employee transferred $25 million during a video call with the CFO and other colleagues. All were deepfakes. The video conference looked real. So did the financial loss.
A safe word: low-tech defense against high-tech threats
The solution exists and it is extremely simple: adopt a "safe word", a secret password used to verify the caller's identity. If you receive a suspicious call from someone claiming to be your child and urgently requesting money, ask for the password; if they don't know it, hang up. If you're in trouble and need to call a family member, say the password: that way, the person on the other end knows it's really you. Period.
The rules for choosing a good password are few but precise. It must be easy to remember but impossible to guess. Do not use pet names, addresses, or dates of birth: this information is often public on social media. A family anecdote works best, such as your child's first word (if different from "mom" or "dad"), or an event known only to you. Share it only with close family members, in person or over the phone. And don't change it unless there's a breach: introducing a new word creates confusion and risks defeating the purpose.
When technology imitates reality too well
The most sensational case in Italy was that of the fake Minister Guido Crosetto. The scammers used software to clone his voice and contact businessmen such as Massimo Moratti, Giorgio Armani, and Patrizio Bertelli. They requested urgent bank transfers for supposedly confidential transactions related to the Cecilia Sala case. A Milanese businessman paid one million euros in two installments. Moratti saw through the scam: "It all seemed absolutely real," he declared. But he did the right thing: he ended the call and checked.
As a study published in JMIR explains, even though voice cloning software closely mimics frequencies and intonations, it still struggles with the natural pauses in human speech. Humans make micropauses, hesitations, and irregular breathing patterns. Machines learn where to insert pauses from training data, but the result is still slightly artificial. This is a detail that escapes most people, especially under emotional stress.
The problem is human, not technological. When faced with a familiar voice calling for urgent help, the brain bypasses logic and activates the protective instinct. This requires a cognitive filter: something that interrupts the emotional automatism and forces the brain to reason. The password works just like that.
Of course, there are objections. In moments of panic, a person in distress could forget the code. It's a real risk. But experts agree: it's better to have an imperfect verification method than no method at all. Some families use security questions instead of a password: "What was the name of our beach motel?" or "What did you eat for your first birthday?" Still others implement two-factor authentication: a password plus a second verification using an app like Google Authenticator.
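For the curious: authenticator apps like the one mentioned above generate their codes with the TOTP algorithm (RFC 6238). Below is a minimal sketch in Python using only the standard library, purely to illustrate how those six-digit codes are derived from a shared secret and the current time; the base32 secret in the example is the RFC test key, not a real one.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, timestep=30, digits=6, now=None):
    """Compute a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32.upper())
    # The moving factor is the number of 30-second intervals since the epoch.
    counter = int((time.time() if now is None else now) // timestep)
    msg = struct.pack(">Q", counter)  # counter as 8-byte big-endian
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation: the low nibble of the last byte picks an offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test key: the ASCII string "12345678901234567890" in base32.
TEST_SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(TEST_SECRET))  # the code valid for the current 30-second window
```

Because both sides derive the code from the same secret and clock, a stolen voice recording is useless without the secret itself, which is the same principle the family password relies on.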
Technology will continue to improve. As we have already reported, systems like ByteDance's OmniHuman-1 are already creating deepfake videos that are indistinguishable from reality. Real-time voice cloning is just around the corner. But as long as scams rely on pre-recorded or semi-automated calls, a password remains the simplest and most effective defense.
Do you have a password with your family? If not, now's the time to choose one. Before someone else does it for you.