Boston Dynamics, the company best known for its eerily agile robots that regularly go viral, is now focusing on what’s inside their heads. At Web Summit in Lisbon, CEO Robert Playter outlined a future where the company’s machines can be controlled with natural language and gestures, a significant leap beyond pre-programmed routines. This means you could soon tell a robot what to do, not just how to do it.
The new AI-driven approach aims to turn the company’s robots into more intuitive partners. For Spot, the quadruped already deployed in factories for tasks like thermal inspection and gauge reading, this means an operator could simply ask it to check a specific piece of equipment rather than scripting the route and actions in advance. This lowers the technical barrier to deployment in complex environments, from manufacturing floors to nuclear facilities. The company has been exploring the use of Large Language Models (LLMs) to give its robots commonsense knowledge and the flexibility to respond to simple spoken commands.
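To make the idea concrete, here is a minimal sketch of how a spoken command might be mapped onto a robot's fixed set of skills by a language model. Everything here is hypothetical: the skill names, the prompt format, and the `fake_llm` stub (a trivial keyword matcher standing in for a real model) are illustrative assumptions, not Boston Dynamics' actual API.

```python
# Hypothetical sketch: routing a natural-language command to one of a
# robot's predefined skills via an LLM. The model call is stubbed with a
# keyword matcher so the example is self-contained and runnable.

SKILLS = {
    "inspect_gauge": "Walk to a gauge and photograph its reading.",
    "thermal_scan": "Run a thermal camera sweep of nearby equipment.",
    "return_to_dock": "Navigate back to the charging dock.",
}

def build_prompt(command: str) -> str:
    """Format the operator's command plus the skill catalog for the model."""
    catalog = "\n".join(f"- {name}: {desc}" for name, desc in SKILLS.items())
    return (
        "Choose exactly one skill name for this command.\n"
        f"Skills:\n{catalog}\nCommand: {command}\nAnswer:"
    )

def fake_llm(prompt: str) -> str:
    """Stand-in for a real model: picks the skill whose name and
    description share the most words with the operator's command."""
    command = prompt.rsplit("Command:", 1)[1].split("\n")[0].lower()
    words = set(command.split())

    def overlap(item):
        name, desc = item
        tokens = (name.replace("_", " ") + " " + desc).lower().split()
        return len(words & set(tokens))

    return max(SKILLS.items(), key=overlap)[0]

def command_to_skill(command: str) -> str:
    """Ask the model for a skill and validate it against the known set."""
    choice = fake_llm(build_prompt(command)).strip()
    if choice not in SKILLS:
        raise ValueError(f"model proposed unknown skill: {choice!r}")
    return choice

print(command_to_skill("run a thermal scan of the equipment"))  # → thermal_scan
```

The key design point is the validation step: the model only ever selects from a closed catalog of skills the robot is known to perform safely, so a misunderstood command fails loudly instead of producing an arbitrary action.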

This push for smarter software extends across their entire lineup. Stretch, the company’s warehouse workhorse, is already on track to move millions of boxes annually and will benefit from AI that allows it to adapt more quickly to chaotic environments like the inside of a shipping container. Meanwhile, the bipedal Atlas continues to serve as the research platform for general-purpose robotics, with recent work focusing on Large Behavior Models (LBMs) that enable the humanoid to tackle complex, multi-step tasks from high-level language prompts.
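The multi-step behavior described above can be caricatured as a planner that expands a high-level prompt into an ordered sequence of sub-steps. This is only a toy illustration of the concept; the task name, the steps, and the lookup-table "planner" are invented for the example and bear no relation to the internals of Atlas's actual Large Behavior Models.

```python
# Toy sketch of a high-level prompt expanding into an ordered plan.
# The "planner" is a hard-coded lookup table; a behavior model would
# generate such a sequence rather than retrieve it.

PLANS = {
    "unload the container": [
        "scan container interior",
        "pick nearest box",
        "place box on conveyor",
        "repeat until container is empty",
    ],
}

def plan(task: str) -> list[str]:
    """Return the ordered sub-steps for a known high-level task."""
    steps = PLANS.get(task.lower().strip())
    if steps is None:
        raise KeyError(f"no plan for task: {task!r}")
    return steps

for i, step in enumerate(plan("Unload the container"), 1):
    print(f"{i}. {step}")
```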

Why is this important?
Boston Dynamics is signaling a crucial shift from focusing on raw mechanical agility to emphasizing cognitive ability. By integrating advanced AI for natural language and gesture control, the company is aiming to make its robots accessible to non-experts. This move from complex programming to simple conversation represents a “paradigm shift” that could dramatically accelerate the adoption of mobile robots in industrial, commercial and, eventually, domestic settings. It’s no longer just about a robot that can do a backflip; it’s about a robot that understands what you want it to do next.
