Her head turns. Slowly, as if trying to figure out where she is. Her eyelids close, then open again. A blink. Then another. Her eyes (cameras, actually) follow something off-screen. Her expression is almost quizzical, as if to ask, "What am I looking at?" She seems alive. But she isn't. This is Origin M1, the latest creation by AheadForm, a Chinese startup that has set out to make humanoid robots less rigid and more… human. The video of the robot's face posted on YouTube went viral within hours. People watch it, then watch it again. Some find it fascinating. Others find it disturbing. Almost everyone agrees on one thing: as of today, we are a little closer to Westworld.
Twenty-five motors for a convincing face
Let me say it right away: Origin M1 is not a toy. It is a system designed for research into human-robot interaction, with potential applications in customer service, education, and healthcare. The head integrates 25 brushless motors, tiny devices that work silently to create the subtle movements that make the robot's face believable: the shifting gaze, the fluttering eyelids, that little nod of the head that almost seems thoughtful.
RGB cameras integrated into the pupils allow the robot to "see" its environment. Microphones and speakers enable real-time voice interaction. The goal of AheadForm, founded in 2024 in Hangzhou, is clear: to create robots capable of expressing emotions, perceiving their environment, and interacting naturally with humans. A bit like a polite but silent colleague: they observe, listen, and respond.
The precedent: Emo and the science of robotic facial expressions
AheadForm's founder, Yuhang Hu, has already made a name for himself. In 2024 he published a study in Science Robotics describing Emo, a facial robot developed at Columbia University. Emo can predict a human smile 839 milliseconds before it happens by analyzing subtle changes in the face of the person in front of it. Then it smiles back, simultaneously. It's not delayed mimicry. It's coexpression.
Research shows that robots capable of anticipating human emotions create more fluid and genuine interactions. The feeling of being understood, even by a machine, improves the quality of the experience. Until you ask yourself: do I really need it to smile at me?
The valley is not so deep anymore
AheadForm has developed several lines of robots. There is the Elf series, with pointed ears and precise control systems. There is the Lan series, more human-like and designed to be affordable. And there is Xuan, a full-body model with an interactive head and advanced expressive capabilities.
The company aims to integrate artificial intelligence systems like large language models directly into robotic heads. The result would be a robot that not only understands what you say, but also how you say it. And responds accordingly, with the right tone, the appropriate expression, the correct timing.
The Origin M1 isn't yet commercially available. But the video is circulating, reactions are multiplying, and the question remains unanswered: when a robot looks you in the eye and blinks, how does that change the way you perceive it? Maybe it changes nothing. Or maybe everything. It depends on how ready we are to accept that a machine can seem "present."
Meanwhile, AheadForm continues to refine the system. And we continue to watch those videos, trying to figure out whether we're feeling curiosity or discomfort. Probably both.