The unfortunate phenomenon of automated phone calls is on the rise. Today, nearly a third of all phone calls are placed by automatic dialers programmed to play a pre-recorded message if someone answers. They're called robocalls, and they're the reason why, every time the phone rings, I assume the "Rambo firing machine-gun bursts" position.
Don't get comfortable, though: it could get even worse. Experts predict that pernicious robocalls are destined to evolve into something high-tech and highly disturbing. Scammers will use artificial intelligence: AI that mimics voices, making their pre-recorded messages sound as if they were spoken by our friends and family.

In an interview on Saturday, academic Tarun Wadhwa told CNN that scammers could one day use datasets available online to identify someone's relatives. They could then use that information to create robocall messages that sound as if they were spoken by a relative of the scam victim.
"Disguised" robocalls: a widespread nightmare
“It's going to be like Photoshop: something so easy and widespread that we'll stop tracking how it's used against people, and we won't find it surprising anymore,” says Wadhwa. It's easy to imagine situations where this kind of voice-imitation technology is used to sow confusion, defraud people, and make scams more targeted and harmful.
Hang up at full blast!
Alex Quilici is the CEO of YouMail, an app that blocks unwanted calls. Fortunately, according to him, we don't yet have to worry about robocalls pretending to be our grandmothers. He has a point. For now.
Building an artificial voice still takes a fair amount of work. If I wanted to build one that sounded like my sister, for example, today I'd need to collect a bunch of recordings of her saying specific phonemes and train a computer model on them. Which would be impossible in my case, because I have no sisters. Oh well, you get the idea.
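To make the idea concrete, here is a deliberately crude toy sketch (not a real voice-cloning system): in the simplest concatenative approach, you store short recorded clips of a target speaker's phonemes and stitch them together to form new words. Everything here is invented for illustration: the phoneme labels, the `fake_recording` helper (which stands in for real audio clips using sine tones), and the sample rate are all assumptions.

```python
# Toy illustration of concatenative synthesis: stitch together
# pre-recorded snippets of a target speaker, keyed by phoneme.
# Real systems also smooth the joins, match prosody, and today
# mostly use neural models instead of raw concatenation.
import math

SAMPLE_RATE = 16_000  # samples per second (an assumed, common rate)

def fake_recording(freq_hz: float, duration_s: float = 0.1) -> list[float]:
    """Stand-in for a real audio clip of one phoneme: a short sine tone."""
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE) for i in range(n)]

# Hypothetical phoneme bank "harvested" from a target speaker.
phoneme_bank = {
    "HH": fake_recording(220.0),
    "EH": fake_recording(330.0),
    "L":  fake_recording(440.0),
    "OW": fake_recording(550.0),
}

def synthesize(phonemes: list[str]) -> list[float]:
    """Concatenate the stored clips in the order given."""
    out: list[float] = []
    for p in phonemes:
        out.extend(phoneme_bank[p])
    return out

audio = synthesize(["HH", "EH", "L", "OW"])  # "hello", very crudely
print(len(audio))  # 4 clips of 0.1 s at 16 kHz -> 6400 samples
```

The point of the sketch is the data requirement, not the quality: even this crude approach only works once you have clean recordings of the target speaker covering every phoneme you need, which is exactly the "fair amount of work" that still stands between scammers and convincing fake voices.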