Someday, in the not too distant future, some of your colleagues may be artificial intelligence avatars. You yourself could have an electronic impersonator attend meetings on your behalf. Adorable, isn't it? Yes: but also terrifying.
Some time ago I told you about Hour One, the company that first demonstrated lifelike avatars. That was less than a year ago: it's striking how quickly this sector is evolving. Today, Synthesia has created AI avatars for business professionals. Companies like the chance to deploy these cyber lookalikes because they are extremely realistic. How realistic? Like this:
But will ordinary people want them?
Some people think artificial intelligence will be good for humans. Others think it will harm us. AI keeps getting better, but many aren't sure what it will do in the future.
For decades now, the media has portrayed artificial intelligence as equal or superior to human intelligence, but the reality is very different. At the moment, most AI is built to take over boring tasks. That, after all, is why no one REALLY sounds the alarm.
But if AI technology enters fields more "dangerous" for us, such as creativity, emotions, morality and analytical power, then people will feel threatened. What's more: they will be threatened. Would you be replaced by an AI impersonator, or would you feel replaced by it? A subtle but substantial difference in perception.
A virtual double
Some people already have a hard time telling a real person from one of Synthesia's virtual avatars. The company can create generic or customized AI human lookalikes that can be programmed to "act out" a typed script, translated and spoken in over 50 languages. The AI avatars are meant for training materials, corporate communications, and even personalized videos. To create your own AI avatar, you just upload a 10-minute video of yourself speaking, ideally one shot against a green screen. The technology analyzes the video, cataloging the person's voice and facial expressions. And there it is: the "perfect" virtual lookalike.
It could make things even harder than they are now
Synthesia is keen to underline its attention to the privacy and security of its customers. Hackers don't care: with far less data than Synthesia needs, they have already pulled off plenty of scams. There are many cases of scammers using deepfakes to clone people's voices or faces (and have money transferred to their accounts). The risk of "cultural" manipulation is also significant: with an AI lookalike of a politician or influencer, a hacker could push ambiguous or misleading messages to the public.
In any case, potential harms aside, Synthesia's AI avatars have generated a lot of hype, and they are already in high demand. Over 4,500 companies (including giants such as Nike, Google and the BBC) have already used a virtual avatar in some way. If anyone is happy about this, it's the global market.
And us? Happy with our AI doubles?
Whether we like it or not, the era of AI lookalikes is here to stay: AI avatars are a certainty of the near future. They'll save us time, they'll make companies a boatload of money, and they'll probably make us question everything we see.
One day, some elderly people of the next decade will come to doubt a loving video message from a child asking for money. And they will be right to.