Boston Dynamics’ Atlas Robot Learns to Understand Its Environment

Check out the Atlas robot from Boston Dynamics. It’s like a real-life superhero: it perceives its surroundings in real time, which makes its actions remarkably precise.

What’s new with Atlas?

Atlas is now all-electric and can perform tasks in a lab setting. Its mission is to pick up engine parts and move them around. The robot constantly updates its understanding of the environment to manipulate the parts effectively. It even analyzes the shape and topology of each part to decide how to handle it.
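
Boston Dynamics hasn’t published how Atlas maps a part’s shape to a handling strategy, but the idea can be sketched with a toy heuristic. Everything below (the `choose_grasp` function, the thresholds, the sample parts) is a hypothetical illustration, not the robot’s actual logic:

```python
import numpy as np

def choose_grasp(points: np.ndarray) -> str:
    """Pick a grasp strategy from a part's point cloud (N x 3).

    Toy heuristic: use the part's bounding-box proportions to decide
    between a fingertip pinch and a full-hand wrap.
    """
    extents = points.max(axis=0) - points.min(axis=0)
    longest, shortest = extents.max(), extents.min()
    # Thin, flat parts (e.g. a gasket) are easier to pinch at an edge;
    # bulky parts (e.g. a casting) call for a wrap grasp.
    return "pinch" if longest > 3 * shortest else "wrap"

rng = np.random.default_rng(0)
# A flat plate: roughly 0.2 m x 0.1 m x 0.01 m
plate = rng.random((100, 3)) * [0.2, 0.1, 0.01]
# A roughly cubic casting: about 0.1 m on each side
casting = rng.random((100, 3)) * [0.1, 0.1, 0.1]
```

A real system would look at far richer geometry (topology, mass distribution, graspable features), but the decision structure is the same: perceive the part, classify its shape, pick a strategy.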

In a video, Atlas reacts to a dropped engine part as if it had “heard” it fall. It then visually searches for the part, identifies it, and picks it up with precision.

Atlas in action

Scott Kuindersma, senior research director at Boston Dynamics, explains what is actually happening: “In this demonstration, the search behavior was triggered manually. The robot doesn’t detect the sound of the part falling. It autonomously finds the object on the floor.”

How does Atlas work?

When an object is in Atlas’s field of view, the robot uses a pose estimation model to determine the object’s position and orientation. The model is trained on large-scale synthetic data and generalizes to new objects without retraining.
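
Atlas’s pose estimator is a learned model, but the geometric core of pose estimation can be sketched with a classic least-squares alignment (the Kabsch algorithm): given corresponding model and observed points, recover the rotation and translation between them. This is a stand-in for intuition, not Boston Dynamics’ code:

```python
import numpy as np

def estimate_pose(model_pts: np.ndarray, observed_pts: np.ndarray):
    """Kabsch algorithm: least-squares rigid transform aligning model
    points to observed points, so that observed ~= model @ R.T + t."""
    mc, oc = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (model_pts - mc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t

# Synthetic check: rotate and translate a point cloud, then recover the pose.
rng = np.random.default_rng(1)
model = rng.standard_normal((50, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
observed = model @ R_true.T + t_true
R, t = estimate_pose(model, observed)
```

With noiseless correspondences the recovery is exact; the learned model’s job is precisely to produce usable correspondences (or poses) from raw pixels, where nothing is this clean.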

The system refines its estimate of the object’s pose by comparing a rendering of the object’s model against the camera image. It can even start from a rough 2D estimate and refine it into a full 3D pose. This approach works reliably across the hundreds of industrial assets Boston Dynamics has already modeled.
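
A minimal stand-in for that render-and-compare loop, assuming a simple pinhole camera and refining only the translation (a real system renders full images and refines rotation too). All names and numbers here are illustrative:

```python
import numpy as np

def project(points: np.ndarray, f: float = 500.0, c: float = 320.0):
    """Pinhole projection of N x 3 camera-frame points to pixel coords."""
    return f * points[:, :2] / points[:, 2:3] + c

def refine_translation(model, observed_px, t0, iters=300, seed=0):
    """Render-and-compare sketch: perturb the translation estimate and
    keep any candidate whose projected ("rendered") points match the
    observed pixels better."""
    rng = np.random.default_rng(seed)
    t = np.asarray(t0, float)
    err = np.mean((project(model + t) - observed_px) ** 2)
    for _ in range(iters):
        cand = t + rng.normal(scale=0.02, size=3)
        e = np.mean((project(model + cand) - observed_px) ** 2)
        if e < err:  # keep only improvements
            t, err = cand, e
    return t, err

rng = np.random.default_rng(7)
model = rng.uniform(-0.05, 0.05, size=(30, 3))  # part-sized point cloud
t_true = np.array([0.10, -0.05, 1.20])          # true translation
observed_px = project(model + t_true)           # what the camera "sees"
t0 = np.array([0.0, 0.0, 1.0])                  # coarse initial estimate
err_init = np.mean((project(model + t0) - observed_px) ** 2)
t_refined, err_final = refine_translation(model, observed_px, t0)
```

The loop structure — render from the current pose guess, measure the image discrepancy, adjust — is the essence of the technique; production systems replace the random search with gradient-based optimization.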

The future of robotics

Vision-based guidance for robots has been around since the 1990s. The big difference now is mobility: a mobile manipulator has to constantly update its map of the world as it moves.
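
As a toy illustration of that continuous updating (not Boston Dynamics’ actual world representation), a hypothetical occupancy grid can blend fresh observations into stale beliefs, so a part that moves quickly flips its old and new cells:

```python
import numpy as np

def update_map(grid: np.ndarray, observations: dict) -> np.ndarray:
    """Merge fresh sensor observations into a persistent occupancy map.

    grid: 2D array of occupancy probabilities in [0, 1].
    observations: maps (row, col) -> newly observed occupancy in [0, 1].
    Exponential blend: new evidence outweighs stale beliefs.
    """
    alpha = 0.7  # weight given to new evidence (illustrative value)
    for (r, c), occ in observations.items():
        grid[r, c] = alpha * occ + (1 - alpha) * grid[r, c]
    return grid

world = np.full((4, 4), 0.5)                       # unknown everywhere
world = update_map(world, {(1, 2): 1.0})           # part seen at (1, 2)
world = update_map(world, {(1, 2): 0.0, (3, 0): 1.0})  # part has moved
```

The point is the loop, not the math: a fixed robot can calibrate once and trust its map forever; a mobile one has to treat every belief as perishable.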

Modern robotic vision uses vision-language models (VLMs) to understand the world through the camera. Older industrial robots were bolted in place and relied on 2D vision and painstaking calibration. Atlas, being mobile, understands the world in 3D and can continue a task even when the environment changes.

The Boston Dynamics demo combines AI-based functions (like perception) with procedural programming that manages the mission. It’s a telling example of how robot software is evolving: to work in the real world, these systems have to handle both subtle and significant changes in the environment.
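
That split can be sketched as a procedural state machine wrapped around an AI perception result. Here `percept` is a hypothetical stand-in for the vision stack’s output; the states and transitions are illustrative, not the demo’s real mission code:

```python
from enum import Enum, auto

class State(Enum):
    SEARCH = auto()   # visually scan for the part
    GRASP = auto()    # approach and pick it up
    PLACE = auto()    # carry it to the target
    DONE = auto()

def mission_step(state: State, percept: dict) -> State:
    """One tick of the procedural mission loop.

    percept: {"part_visible": bool, "part_in_hand": bool, "at_target": bool}
    """
    if state is State.SEARCH:
        return State.GRASP if percept["part_visible"] else State.SEARCH
    if state is State.GRASP:
        # A dropped part sends the robot back to visual search instead
        # of aborting: the procedural layer absorbs the disturbance.
        return State.PLACE if percept["part_in_hand"] else State.SEARCH
    if state is State.PLACE:
        return State.DONE if percept["at_target"] else State.PLACE
    return State.DONE

# Simulate the dropped-part episode from the video.
s = State.SEARCH
s = mission_step(s, {"part_visible": True, "part_in_hand": False, "at_target": False})
s = mission_step(s, {"part_visible": True, "part_in_hand": False, "at_target": False})  # drop!
```

The AI components answer “what do I see?”; the procedural layer answers “what do I do about it?” — and the recovery behavior falls out of the transitions rather than being a special case.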

Atlas in motion

As we watch Atlas move, its actions are fascinating. The robot’s decision-making process isn’t fully visible, but we can see it pause, process, decide, and move. That processing time will likely shrink as the code and the AI models mature.

The current goal is to develop AI-based software that allows robots to adapt, understand their surroundings, and learn continuously from multimodal data.
