Figure’s Helix 02: An AI Brain That Finally Does the Dishes

For years, humanoid robots have been the tech world’s gawky adolescents: impressive when performing a choreographed dance or a backflip for the cameras, but hopelessly out of their depth when asked to do a bit of housework. The robotics industry has long been tripped up by “loco-manipulation”—the fiendishly complex challenge of getting a robot to walk and use its hands simultaneously without it collapsing into a heap of expensive scrap and regret.

Enter Figure AI with Helix 02, a new AI model that doesn’t just walk and chew gum; it walks, carries the good china, and unloads a dishwasher in a continuous, four-minute autonomous sequence. This isn’t another over-edited, short-horizon sizzle reel. It is a masterclass in end-to-end robotics: a single neural network controlling an entire humanoid frame, from pixels to torque, finally closing the gap between getting around and getting things done.

The Death of the Digital Frankenstein

Traditionally, making a humanoid do anything useful required a clunky Frankenstein’s monster of a codebase. One controller would handle the walking, another would step in for stabilisation, and a third would take the baton for reaching and grasping. It was a slow, brittle, and profoundly unnatural way to move. If a plate shifted or a chair was in the way, the whole fragile tower of logic would come crashing down.

“True autonomy requires something fundamentally different: a single learning system that reasons over the whole body at once,” Figure’s announcement states. “A system that continuously perceives, decides, and acts.”

This is the “aha!” moment Helix 02 was built to deliver. Instead of stitching together disparate systems, Figure has engineered a hierarchical AI architecture that thinks and moves as a unified whole.

A Three-Tiered Mind for a Machine

The magic behind Helix 02 lies in a three-system architecture, with each layer operating on its own clock. Think of it as a corporate structure for movement, from the CEO setting the five-year plan to the intern actually shifting the boxes (a rough sketch of how the three clocks might nest follows the list).

  • System 2 (The Strategist): This is the high-level reasoning layer. It digests the scene and the language, breaking down a command like “Unload the dishwasher” into a logical sequence of goals. It moves at a slower pace, focusing on the big picture.
  • System 1 (The Tactician): This is the visuomotor policy that hooks up all the robot’s senses—head cameras, new palm-mounted sensors, and fingertip tactile feedback—to its joints. It translates the CEO’s goals into lightning-fast, 200 Hz full-body commands.
  • System 0 (The Athlete): This is the foundation, a model trained on over 1,000 hours of human motion data. Operating at a blistering 1 kHz, it ensures every movement is stable, balanced, and looks “properly human.” In a move of pure technical bravado, Figure notes that System 0 replaces 109,504 lines of hand-engineered C++ with a single neural network. They’ve essentially sacked a library’s worth of code and hired an AI that learned the ropes by mainlining human movement.
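
To make that division of labour concrete, here is a minimal, purely illustrative Python sketch of how three control loops ticking at different rates might nest. The 200 Hz and 1 kHz figures come from Figure's description; the planner's behaviour, the function names, and the data structures are assumptions invented for illustration, not Helix 02's actual internals.

```python
import time

# Illustrative tick rates: the 200 Hz and 1 kHz numbers are from the article;
# everything else in this sketch is a hypothetical stand-in, not Figure's API.
SYSTEM0_HZ = 1000   # "the athlete": whole-body stabilisation
SYSTEM1_HZ = 200    # "the tactician": visuomotor policy

def system2_plan(command):
    """Slow reasoning layer: break a language command into a queue of sub-goals."""
    return [f"{command} - step {i}" for i in range(1, 4)]

def system1_act(goal, camera_frame, tactile_reading):
    """Visuomotor policy: map the current goal plus sensing to full-body joint targets."""
    return [0.0] * 40   # placeholder targets for a ~40-joint humanoid

def system0_stabilise(joint_targets):
    """Fast whole-body layer: turn targets into balanced joint torques every tick."""
    return [0.1 * t for t in joint_targets]

def run(command, duration_s=0.02):
    goal = system2_plan(command)[0]      # System 2 plans up front, at its own pace
    targets = [0.0] * 40
    start, tick = time.time(), 0
    while time.time() - start < duration_s:
        if tick % (SYSTEM0_HZ // SYSTEM1_HZ) == 0:
            # every 5th tick (~200 Hz): refresh the action from vision and touch
            targets = system1_act(goal, camera_frame=None, tactile_reading=None)
        torques = system0_stabilise(targets)   # fastest loop: keep the body balanced
        tick += 1
        time.sleep(1.0 / SYSTEM0_HZ)           # pace the inner loop at roughly 1 kHz

run("Unload the dishwasher")
```

The point of the nesting is that the inner loop never waits on the outer one: balance gets refreshed every millisecond even while the slower layers are still deliberating, which is the general pattern behind hierarchical controllers of this kind.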

This “pixels-to-whole-body” pipeline allows the robot to perform 61 distinct loco-manipulation actions during its four-minute dishwasher ballet. It fluidly transitions between walking, carrying, and placing, even using its hip to nudge a drawer shut when its hands are full—a touch of “Robot King” swagger that feels remarkably natural.

So, What Can It Actually Do?

The dishwasher task is the headline act, but the hardware upgrades on the Figure 03 robot—specifically the palm cameras and tactile sensors—unlock a whole new level of finesse. These sensors provide the haptic feedback required for tasks that were previously impossible for systems relying on sight alone.

The tactile sensors can detect forces as light as three grams-force (roughly 0.03 N)—sensitive enough to “feel” a paperclip. This opens the door to a new class of fine-motor skills.
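
To see why that kind of sensitivity matters, here is a toy, hypothetical sketch of a contact-driven grasp: the fingers close in tiny increments until the fingertip sensor reports a force above roughly three grams-force, rather than trusting cameras to judge when something as light as a paperclip has actually been touched. The helper functions, units, and step sizes below are invented for illustration.

```python
# Toy illustration (not Figure's code) of a grasp driven by touch rather than sight.
GRAM_FORCE_N = 0.0098                       # one gram-force expressed in newtons
CONTACT_THRESHOLD_N = 3 * GRAM_FORCE_N      # ~0.03 N, the sensitivity quoted above

def close_until_contact(read_fingertip_force, step_gripper, max_steps=500):
    """Close the gripper in small increments until the fingertips feel contact."""
    for _ in range(max_steps):
        if read_fingertip_force() >= CONTACT_THRESHOLD_N:
            return True                     # stop squeezing once the object pushes back
        step_gripper(0.2)                   # close a little further (hypothetical millimetres)
    return False                            # nothing felt; the grasp probably missed

# Example run with stand-in hardware hooks and simulated force readings (in newtons):
readings = iter([0.0, 0.0, 0.01, 0.04])
print(close_until_contact(lambda: next(readings), lambda mm: None))   # -> True
```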

Finesse Beyond the Kitchen

Helix 02 was put through a gauntlet of “fiddly” tasks to prove its dexterity wasn’t just a fluke:

  • Unscrewing a bottle cap: This requires precise, bimanual coordination and delicate force control to avoid crushing the plastic.
  • Picking a single pill from an organiser: This utilises the palm cameras for a close-up view when the robot’s “head” cameras are blocked by its own arms.
  • Dispensing exactly 5 ml from a syringe: A task demanding constant tactile feedback to apply smooth, unwavering pressure.
  • Singulating metal parts from a cluttered box: A real-world application from Figure’s own BotQ manufacturing facility, proving the bot can handle the “shambles” of a messy industrial environment.

Analysis: A Paradigm Shift for the Useful Humanoid

While other firms are busy showing off robots that can do backflips or parkour, Figure is laser-focused on the unglamorous but vital challenge of making humanoids actually useful. The leap from the original Helix, which only governed the upper body, to Helix 02’s full-body autonomy in just twelve months is a staggering indicator of how fast this field is moving.

The key takeaway here is the move away from brittle, hand-coded behaviours towards a learned, adaptable system. By training its foundation model on a massive dataset of human movement, Figure is embedding a “natural prior”—an instinct for how a two-legged frame should balance. This frees up the higher-level AI to worry about what to do, while the lower-level system handles the how.

This isn’t about building a robot that does one thing perfectly; it’s about building a platform that can learn to do anything. As Figure CEO Brett Adcock has pointed out, any improvement to the Helix neural network can be beamed out to the entire fleet, meaning what one robot learns, every robot learns. With the robot’s actuators currently running at only 20-25% of their peak speed, the ceiling for performance on this hardware is sky-high.
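
For a flavour of what that fleet-wide learning claim means mechanically, here is a deliberately simplified, hypothetical sketch: a single improved policy checkpoint is pushed out to every robot, so a skill refined once propagates everywhere. None of the names or version strings below reflect Figure's actual infrastructure.

```python
from dataclasses import dataclass, field

@dataclass
class Robot:
    robot_id: str
    policy_version: str = "helix-02.0"      # hypothetical checkpoint label

@dataclass
class Fleet:
    robots: list = field(default_factory=list)

    def push_update(self, new_version):
        """Roll one improved policy checkpoint out to every robot in the fleet."""
        for robot in self.robots:
            robot.policy_version = new_version

# One training run, fleet-wide improvement: every unit now runs the newer policy.
fleet = Fleet([Robot(f"unit-{i:03d}") for i in range(3)])
fleet.push_update("helix-02.1")
print({r.robot_id: r.policy_version for r in fleet.robots})
```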

We are still in the early innings, but this represents a fundamental shift. By solving the problem of continuous, whole-body autonomy, Figure has taken a massive step towards a true general-purpose robot—one that might finally be ready to take over the chores, no state machines required.