Exoskeletons Turn Humans Into Puppeteers for Robot Training

In the relentless quest to make humanoid robots less clumsy, a new paradigm is emerging: turning humans into real-time puppet masters. Exoskeleton-enabled data acquisition lets operators directly pilot robots through complex tasks, capturing a firehose of high-fidelity data that simulations can only dream of. The approach aims to shortcut the expensive, slow process of gathering real-world training data.

Enter Daimon Robotics, a Hong Kong-based company that has developed the DM-EXton, a wearable teleoperation system designed for exactly this purpose. An operator wearing the lightweight suit can control a humanoid with high precision, while the robot’s multimodal sensors—capturing vision, force, and crucial tactile feedback—record every nuance of the interaction. This process generates the kind of messy, authentic data essential for training robust AI models, effectively sidestepping the notoriously difficult “sim-to-real” gap.
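To make the recording step concrete, here is a minimal sketch of what logging synchronized operator commands and multimodal sensor readings might look like. This is an illustration under stated assumptions, not Daimon's actual software: the field names, the 7-DoF arm, and the stand-in signal generators are all hypothetical.

```python
import json
import time
from dataclasses import dataclass, asdict, field

@dataclass
class Frame:
    """One timestep of a teleoperation episode (hypothetical schema)."""
    t: float                 # seconds since episode start
    joint_targets: list      # operator-commanded joint angles (rad)
    wrist_force: list        # 3-axis wrist force (N)
    tactile: list            # fingertip pressure array (arbitrary units)

def record_episode(read_operator, read_sensors, steps=100, hz=50):
    """Log operator commands alongside robot sensor readings at a fixed rate."""
    episode = []
    for i in range(steps):
        t = i / hz
        force, tactile = read_sensors(t)
        episode.append(Frame(t, read_operator(t), force, tactile))
    return episode

# Stand-in signal generators for a demo run (no real hardware attached).
def fake_operator(t):
    return [0.1 * t] * 7                     # hypothetical 7-DoF arm command

def fake_sensors(t):
    return [0.0, 0.0, 9.8], [0.2] * 16       # fake force + 16-taxel pad

episode = record_episode(fake_operator, fake_sensors, steps=5)
print(json.dumps(asdict(episode[0])))        # frames serialize cleanly for a dataset
```

Each frame pairs what the human commanded with what the robot felt at that instant, which is precisely the supervision signal an imitation-learning pipeline needs.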

Why is this important?

This isn’t just a fancy remote control; it’s a data generation engine. By creating a closed loop of human skill → robot action → model training, companies can rapidly build sophisticated behavioral models from a rich dataset. While the immediate goal is smarter robots, the technology has clear offshoots for remote work in hazardous environments, assistive care, and precision manufacturing. With Daimon Robotics showcasing its latest DM-EXton2 system at CES 2026, the industry is clearly betting that the fastest way to build an artificial human is to start with a real one.
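The "model training" leg of that loop is typically some form of behavior cloning: fit a policy that maps recorded observations to the operator's recorded actions. A toy sketch, with a linear least-squares fit standing in for the neural network a real pipeline would train, and with entirely synthetic data:

```python
import numpy as np

# Toy demonstration dataset: observations (robot state + tactile readings)
# paired with the actions the human operator took. All values are synthetic.
rng = np.random.default_rng(0)
obs = rng.normal(size=(500, 8))          # 8-D observation per timestep
true_policy = rng.normal(size=(8, 3))    # hidden linear "human skill"
acts = obs @ true_policy                 # actions logged during teleoperation

# Behavior cloning: regress actions on observations. In practice this
# would be a deep network; least squares keeps the sketch self-contained.
W, *_ = np.linalg.lstsq(obs, acts, rcond=None)

# The cloned policy should now reproduce the demonstrated behavior.
err = float(np.abs(obs @ W - acts).max())
print(f"max action error: {err:.2e}")
```

The point of the exoskeleton is that `obs` and `acts` come from real contact-rich interaction rather than a simulator, so the cloned policy inherits genuinely human strategies.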