AI Research | 8/23/2025
Figure's AI Humanoid Walks Blind, Camera-Free
Figure has demonstrated its humanoid robot maintaining balance and walking without any visual input, relying on internal sensors and a learned control system. The team trained a neural policy in a high-fidelity simulation and then transferred it to the real robot in a zero-shot regime using domain randomization. The result signals a stable, vision-free foundation for robots in warehouses and factories.
Introduction
In a lab where sensors hum and metal settles into precise stances, Figure is pushing a familiar boundary: a humanoid that can walk and balance without looking. The company's latest tests show a robot that stays upright and keeps moving even when engineers give it a shove. No cameras, no vision system. Just gut-level balance, a rich stream of proprioceptive feedback, and a neural policy trained to read the world from inside the robot's own body.
The breakthrough at a glance
Think of a robot that learns to walk the way a novice skateboarder does—by trying, falling, and adjusting—only this time the lessons happen inside a simulated city of thousands of digital twins. Figure’s engineers built an end-to-end neural network, nicknamed the Helix walking controller, and trained it in a high-fidelity physics simulation. In that virtual world, thousands of copies of the Figure 02 robot face an array of terrains, from slick hardwood to uneven concrete, with shifting friction, simulated trips, and even deliberate pushes. As in any good learning loop, the model earns its keep when it makes stable, human-like movements—heel-strikes landing with the right timing, toe-offs coordinating with arm swings—and it pays for mistakes with a tumble.
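Figure has not published the reward design behind this loop, but its general shape is well established in sim-to-real locomotion work. The sketch below is a minimal, hypothetical Python reward, with illustrative terms and weights, showing how such training can favor a stable, command-following gait while charging for wasted energy and punishing falls.

```python
import numpy as np

def locomotion_reward(state: dict, fell: bool) -> float:
    """Illustrative per-step reward for a simulated walking policy.

    Every term and weight here is hypothetical, mirroring common
    sim-to-real practice: reward a stable, command-following gait,
    penalize wasted energy, and punish falls.
    """
    # Track a commanded forward velocity rather than raw speed, so the
    # policy learns controllable walking instead of sprinting.
    vel_error = abs(state["forward_velocity"] - state["commanded_velocity"])
    r_velocity = float(np.exp(-4.0 * vel_error))

    # Stay upright: penalize torso deviation from vertical.
    r_upright = float(np.exp(-10.0 * state["torso_tilt_rad"] ** 2))

    # Discourage jerky, energy-hungry motion with a torque penalty.
    r_effort = -1e-4 * float(np.sum(np.square(state["joint_torques"])))

    # A fall costs heavily and ends the rollout: the tumble the article
    # describes as the price of a mistake.
    r_fall = -10.0 if fell else 0.0

    return r_velocity + 0.5 * r_upright + r_effort + r_fall

# One step from one of thousands of parallel simulated robots.
example_state = {
    "forward_velocity": 1.1,        # m/s
    "commanded_velocity": 1.0,      # m/s
    "torso_tilt_rad": 0.05,
    "joint_torques": np.zeros(23),  # one entry per actuated joint (hypothetical count)
}
print(locomotion_reward(example_state, fell=False))
```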
From simulation to the real world, zero-shot
The standout feature here is the "zero-shot" transfer: a policy learned entirely in simulation works on the physical robot without additional fine-tuning. The trick is domain randomization: throughout training, the simulator continually varies the robot's physical parameters. Exposed to thousands of subtly different bodies and surfaces, the policy becomes robust enough to handle real-world variance when it is deployed on the actual machine.
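A minimal sketch of the idea, with hypothetical parameter names and ranges (Figure's actual randomization scheme is not public): each training episode draws a slightly different body and world, so no single set of physics can be overfit.

```python
import random

# Hypothetical randomization ranges; Figure's actual scheme is not public.
# Each training episode samples one value per parameter, so every episode
# plays out on a slightly different body and surface.
RANDOMIZATION_RANGES = {
    "link_mass_scale":      (0.85, 1.15),   # +/-15% on each link's mass
    "joint_friction":       (0.00, 0.05),
    "ground_friction":      (0.40, 1.20),   # slick hardwood to rough concrete
    "motor_strength_scale": (0.90, 1.10),
    "sensor_noise_std":     (0.00, 0.02),   # rad, added to joint encoders
    "push_force_newtons":   (0.00, 150.0),  # random shoves mid-episode
}

def sample_episode_params(rng: random.Random) -> dict:
    """Draw one randomized configuration for a single training episode."""
    return {name: rng.uniform(lo, hi)
            for name, (lo, hi) in RANDOMIZATION_RANGES.items()}

rng = random.Random(0)
for episode in range(3):
    print(sample_episode_params(rng))
```

Because the policy never sees the same physics twice, it cannot memorize any one body; the real Figure 02 then looks like just another draw from the training distribution, which is what makes deployment without fine-tuning plausible.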
When deployed on the Figure 02, the robot relies on proprioception rather than sight. Internal sensors measure joint angles, forces, and body orientation, providing a continuous sense of where the limbs are and how they’re moving. In effect, the robot learns to walk with its eyes closed, using a refined sense of self to react to disturbances in real time. This isn’t a toy trick; it’s a core capability that underpins stability on tricky surfaces and in unpredictable environments.
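In code, a proprioception-only controller reduces to a tight sense-infer-act loop. The sketch below assumes a hypothetical observation layout and joint count, and stubs out the trained network; the point is what the policy sees, and what it never sees: camera data.

```python
import numpy as np

NUM_JOINTS = 23  # hypothetical actuated-joint count, not a published spec

def build_observation(joint_pos, joint_vel, torso_quat, torso_gyro, foot_forces):
    """Pack proprioceptive signals into one flat vector for the policy.

    Note what is absent: no camera frames, no depth maps. The policy sees
    only joint angles and velocities, torso orientation and angular rate,
    and foot contact forces, i.e. where the limbs are and how they move.
    """
    return np.concatenate([joint_pos, joint_vel, torso_quat,
                           torso_gyro, foot_forces])

def policy(obs: np.ndarray) -> np.ndarray:
    """Stand-in for the trained network: observations in, joint targets out."""
    return np.zeros(NUM_JOINTS)  # a real policy would emit learned targets

# One tick of the fixed-rate control loop: sense, infer, actuate.
obs = build_observation(
    joint_pos=np.zeros(NUM_JOINTS),
    joint_vel=np.zeros(NUM_JOINTS),
    torso_quat=np.array([1.0, 0.0, 0.0, 0.0]),  # upright torso
    torso_gyro=np.zeros(3),
    foot_forces=np.array([400.0, 400.0]),       # N per foot, standing still
)
joint_targets = policy(obs)  # would be sent to the actuators each step
print(obs.shape, joint_targets.shape)
```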
Why vision-free locomotion matters
- Vision systems, as crucial as they are, have limits. Poor lighting, dust, smoke, or even sensor glitches can throw a robot’s navigation off course. By establishing a sturdy, sight-independent foundation for balance, Figure aims to keep robots mobile and safe even when vision fails.
- The practical payoff is obvious in dynamic workspaces like warehouses, factory floors, and retail settings. There, humans and machines share space, and a bot that can resist disturbances and regain footing quickly reduces the risk of injury or downtime.
- Beyond immediate portability, this approach also signals a broader industry shift: learned behaviors that scale. Instead of manually scripting every possible action, a single, improving AI model can guide many robots across fleets, lowering maintenance and deployment costs over time.
The hardware and the broader context
Figure 02 stands five feet six inches tall and is an all-electric platform built for human-centric environments. Its ability to execute the Helix controller's commands reflects tight integration of software and hardware, with sensors, actuators, and control policies working in concert to produce fluid, resilient movement.
This development comes as the humanoid robotics sector moves toward an AI-first paradigm. Competitors like Boston Dynamics have long set the standard for dynamic locomotion, but Figure’s emphasis on rapid, simulation-based learning and zero-shot transfer offers a scalable path to large fleets of robots all guided by a single evolving AI model. If the model keeps improving, a future where a broad set of humanoid workers operate with minimal calibration and supervision moves closer to reality.
Practical implications and future work
- Robust, vision-free locomotion could serve as a foundational layer for more complex tasks. A robot that can stay upright and move confidently in imperfect lighting, dusty warehouses, or smoke-filled environments becomes a safer collaborator—one that can handle the rough edges of real-world work without constant human tweaking.
- The approach also raises questions about safety, reliability, and transparency. If a single neural policy governs movement, how do engineers verify that it behaves as intended under every possible disturbance? One partial answer is systematic stress testing in simulation, sketched after this list; industry observers will be watching how Figure addresses these questions as deployments scale.
- In the broader market, investors and customers are watching the pace of AI-first robotics. The idea of a general-purpose humanoid worker—capable across tasks with minimal hardware tweaks—moves from hype to a tangible, testable reality, driven by simulation, domain randomization, and a steady push of data back into the model.
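As a concrete illustration of that kind of stress testing, the sketch below sweeps hypothetical push magnitudes and directions and reports recovery rates. The physics rollout is stubbed out, and nothing here reflects Figure's actual evaluation harness; it only shows the shape of an empirical answer to the verification question.

```python
import random

def simulate_push_recovery(force_n: float, direction_deg: int, seed: int) -> bool:
    """Stand-in for a physics rollout: True if the robot stays upright.

    A real harness would load the trained policy into the simulator,
    apply the push mid-stride, and check for a fall; this stub only
    keeps the sketch runnable end to end.
    """
    rng = random.Random(hash((force_n, direction_deg, seed)))
    return rng.random() > force_n / 300.0  # weaker pushes recover more often

# Sweep a grid of disturbances and report recovery rates per force level:
# one blunt but useful way to characterize a policy that cannot be
# verified line by line.
for force in (50.0, 100.0, 150.0, 200.0):
    trials = [simulate_push_recovery(force, direction, seed)
              for direction in range(0, 360, 45)
              for seed in range(25)]
    rate = sum(trials) / len(trials)
    print(f"push {force:5.1f} N: recovered in {rate:.0%} of {len(trials)} trials")
```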
Looking ahead
As these systems grow more capable, the line between science fiction and factory floor blurs. The workforce of the near future may include bots that don’t rely on cameras to stay balanced, adapting in real time to disturbances with reflexive, learned responses. That kind of agility, paired with scalable AI training loops, could redefine what’s possible in industrial automation—and what’s practical for everyday human-robot collaboration.
About Figure
Figure’s work sits at the intersection of AI research and commercial robotics. The company’s aim is to build practical humanoid agents that can operate in human-centric spaces, learning from simulations to perform real-world tasks with reliability and speed.