Automatic telephone calls are an unfortunate and growing phenomenon. Today nearly a third of all phone calls are placed by automatic dialers programmed to play a pre-recorded message if someone answers. They are called robocalls, and they are the reason why, every time the phone rings, I snap into my "Rambo firing a machine gun" stance.
Don't relax, though. It could get even worse. Experts predict that pernicious robocalls are headed for a high-tech, highly disturbing evolution: scammers will use artificial intelligence that mimics voices, making their pre-recorded messages sound as if they were spoken by our friends and family.
In an interview published Saturday, academic Tarun Wadhwa told CNN that scammers could one day mine datasets available online to identify someone's relatives, then use that information to create robocall messages that sound as if they were spoken by a relative of the scam victim.
“Disguised” robocalls: a widespread nightmare
“It will be like Photoshop: something so easy and widespread that we will stop tracking how it is used against people, and we won't even find it surprising,” Wadhwa says. It's easy to imagine scenarios where this kind of voice-impersonation technology is used to sow confusion, defraud people, and make scams more precise and more harmful.
Hang up at full blast!
Alex Quilici is the CEO of YouMail, an app that blocks unwanted calls. According to him, fortunately, we don't yet have to worry about robocalls impersonating our grandmothers. And he's not entirely wrong: for now.
Building an artificial voice still takes a fair amount of work today. If I wanted to build one that sounded like my sister, for example, I would currently need to collect a large number of samples of her pronouncing specific phonemes and train a computer model on them. And in my case it would be impossible anyway, because I have no sisters. Oh well, you get the idea.
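For the curious, the old-school "collect samples and stitch them together" idea behind that process can be sketched in a few lines. This is a deliberately toy illustration, not anyone's real product: the `voice_bank` below holds synthetic sine tones standing in for actual recorded phoneme clips, and real concatenative synthesis would also smooth the joins between clips rather than butting them together.

```python
import math
import struct
import wave

SAMPLE_RATE = 16000  # samples per second

def fake_phoneme(freq_hz, duration_s=0.15):
    """Stand-in for a recorded phoneme clip: a short 16-bit sine tone."""
    n = int(SAMPLE_RATE * duration_s)
    return [int(20000 * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE))
            for i in range(n)]

# Hypothetical "voice bank": each phoneme maps to a recorded sample.
# In a real system these would be clips of the target speaker's voice.
voice_bank = {
    "h": fake_phoneme(220),
    "e": fake_phoneme(330),
    "l": fake_phoneme(440),
    "o": fake_phoneme(550),
}

def synthesize(phonemes, out_path="hello.wav"):
    """Concatenate the stored clips in order and write a mono WAV file."""
    samples = [s for p in phonemes for s in voice_bank[p]]
    with wave.open(out_path, "wb") as w:
        w.setnchannels(1)           # mono
        w.setsampwidth(2)           # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))
    return len(samples)  # number of audio frames written

synthesize(["h", "e", "l", "l", "o"])
```

This is exactly why the approach is laborious: you need a clean recording for every phoneme of the target voice before you can say anything at all, which is the barrier Quilici is counting on for now.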