A robot that pedals a bicycle. Another that cleans the house without ever having received instructions on how to do it. Are we sure we still want to call these machines "simple household appliances"? The Chinese company AgiBot has just unveiled Lingxi X2, a versatile humanoid that shatters many of our certainties about what "robotic learning" means. Forget the idea that robots must be programmed and trained for every single action: this automaton learns by observation, just like a child would. Its secret? An artificial intelligence model called GO-1 that transforms observation into practical expertise. It is as if we have just crossed an evolutionary threshold in robotics.
Agibot, a miniature (but not too miniature) humanoid
Don't be fooled by the compact size: X2 is 1.3 meters tall and weighs just over 33 kilos. It is a concentration of technology that sets new standards in robotics and flexibility of movement. It walks, runs, turns, dances (better than me, I fear) and above all does things that no one has ever explicitly taught it. This is where the real breakthrough lies: generalizing to behaviors it has never seen before.
Because you see, until yesterday the paradigm was simple: if you wanted a robot to do something, you had to program it for that specific task. It was like having a child to whom you had to explain every single action step by step. With X2, AgiBot has changed the rules: watch, understand, replicate. Period. Creepy? Maybe. Fascinating? Certainly.
The intelligence that observes (and reads inside you)
But it's not just about moving through space. X2 is equipped with a multimodal interaction system that responds in milliseconds. Here's where things get even more interesting (or scary, depending on your level of technological paranoia): the humanoid is able to analyze your facial expressions and tone of voice to accurately identify your emotional state.
X2 even simulates human breathing, silently "observes" its surroundings and displays subtle body language and micro-movements.
I don't know about you, but it makes me feel strange knowing that a robot can "breathe" and interpret emotions. AgiBot argues that these characteristics allow for more appropriate and authentic responses. I wonder what “authentic” means when we talk about machines.
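AgiBot has not published how X2's emotion reading actually works, but the general technique it describes, combining signals from the face and the voice, is usually implemented as "late fusion": each modality gets its own classifier, and their probabilities are blended. Here is a minimal, purely illustrative sketch; the labels, scores and weights are all assumptions, not details of X2.

```python
import math

# Hypothetical late-fusion sketch of multimodal emotion recognition.
# Two independent classifiers (one for facial expression, one for tone of
# voice) each emit raw scores over the same emotion labels; the fused
# estimate is a weighted average of their probabilities.

EMOTIONS = ["neutral", "happy", "sad", "angry"]

def softmax(scores):
    """Turn raw classifier scores into probabilities."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fuse(face_scores, voice_scores, face_weight=0.6):
    """Late fusion: weighted average of per-modality probabilities."""
    p_face = softmax(face_scores)
    p_voice = softmax(voice_scores)
    return [face_weight * f + (1 - face_weight) * v
            for f, v in zip(p_face, p_voice)]

# Toy example: both modalities lean toward "happy", with different confidence.
fused = fuse([0.1, 1.2, 0.0, -0.5], [0.0, 2.0, -1.0, -1.0])
print(EMOTIONS[fused.index(max(fused))])  # -> happy
```

The appeal of late fusion is robustness: if one modality is ambiguous (a neutral face but a trembling voice), the other can still dominate the final estimate.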
Agibot, the brain behind the scenes
GO-1, the AI model that powers X2, uses a system called “latent actions” that helps the robot understand movements by analyzing both past and current frames. There’s also a component called Latent Planner that predicts action sequences using a specific transformer model.
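To make the "latent actions" idea concrete, here is a toy sketch of the two-stage structure the article describes: an encoder that compresses the change between consecutive frames into a compact action code, and a planner that extends that code into a predicted sequence. Every name, dimension and weight below is an illustrative assumption (random linear maps standing in for trained networks), not AgiBot's actual GO-1 implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

FRAME_DIM = 64    # assumed size of a visual frame embedding
LATENT_DIM = 8    # assumed size of the latent action code

# A fixed random projection stands in for a learned encoder that maps a
# (previous frame, current frame) pair to a latent action code.
W_enc = rng.standard_normal((2 * FRAME_DIM, LATENT_DIM)) / np.sqrt(2 * FRAME_DIM)

def infer_latent_action(prev_frame, cur_frame):
    """Compress the transition between two frames into a latent action."""
    pair = np.concatenate([prev_frame, cur_frame])
    return np.tanh(pair @ W_enc)

# A stand-in for the "Latent Planner": given the current frame and the last
# latent action, predict the next latent action (one step of a sequence
# model; GO-1 reportedly uses a transformer here).
W_plan = rng.standard_normal((FRAME_DIM + LATENT_DIM, LATENT_DIM)) / np.sqrt(FRAME_DIM + LATENT_DIM)

def plan_next_action(cur_frame, last_action):
    return np.tanh(np.concatenate([cur_frame, last_action]) @ W_plan)

# Toy rollout: two observed frames yield one latent action, which the
# planner extends into a short predicted action sequence.
f0, f1 = rng.standard_normal(FRAME_DIM), rng.standard_normal(FRAME_DIM)
action = infer_latent_action(f0, f1)
plan = [action]
for _ in range(3):
    action = plan_next_action(f1, action)
    plan.append(action)

print(len(plan), plan[0].shape)  # 4 latent-action steps of dimension 8
```

The point of the latent code is that it can be learned from raw video without action labels, which is what lets a robot extract "how to do it" from mere observation.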
In performance tests on five different tasks, GO-1 significantly outperformed state-of-the-art models, increasing success rates from 46% to 78%. Particularly impressive are improvements in complex tasks such as pouring water and refilling drinks. Restaurants, estote parati.
Practical applications (and those to come)
AgiBot envisions X2 as a butler, a cleaner, a security guard. But the potential extends to healthcare, education and many other sectors. The company talks about it as an “assistant to human life,” bringing “greater possibilities to the future of intelligent life.”
Here we are again with the eternal question: how much are we willing to delegate to machines? And above all: how much do we want them to become like us? If these robots continue to evolve at this rate, soon the question will no longer be “what can they do,” but “what will we choose to have them do” (I don’t want to get to “what will they choose to do” yet).
It's a thought worth considering as we enjoy the spectacle of a humanoid pedaling a bicycle.