Optimus TeslaBot – Tesla AI Day 2022

Tesla unveiled their working Optimus TeslaBot prototype about 17 minutes and 26 seconds into their live stream Friday evening. Shortly after, Elon reiterated the importance of labor in our economy, a concept he believes few truly understand.

https://youtube.com/watch?v=ODSJsviD_SU&start=1046
Optimus TeslaBot Prototype Unveiling

Elon further commented on the importance of Tesla being a public company; as such, the public is able to control its future if we band together and vote in sync.

Optimus TeslaBot

They have made real progress: a working prototype within six months of last year's announcement.

Optimus Timeline
Timeline

They expect to put the latest-generation Bumblebee TeslaBot on the plant floor within a matter of months.

The live stream hit on the hard problems of bipedal, humanoid robotics, detailing Tesla's approach of developing requirements by analyzing human performance.

Optimus TeslaBot – Battery and Processors

They have pulled battery technology from existing products, centralizing power distribution and computing at the physical center of the platform. Power management is handled on a single PCB covering sensing, fusing, charge management, and power distribution for the 2.3 kWh pack, which should be sufficient for a full day's work.
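As a rough sanity check on that claim, a quick power-budget calculation; the pack size is from the presentation, but the shift length is my assumption:

```python
# Back-of-envelope check on the 2.3 kWh / "full day's work" claim.
PACK_WH = 2300          # 2.3 kWh battery pack (from the presentation)
SHIFT_HOURS = 8         # assumed working shift, not a Tesla figure

avg_power_w = PACK_WH / SHIFT_HOURS   # sustainable average draw
print(f"Average draw over an {SHIFT_HOURS} h shift: {avg_power_w:.1f} W")
# ≈ 287.5 W average
```

A sub-300 W average budget seems plausible for light, intermittent work, but would be tight for continuous heavy lifting or fast locomotion.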

The central computer in the bot is a modified Tesla Full Self-Driving computer.

Optimus Design
Optimus Tesla Bot Specs

Optimus TeslaBot – Actuators

The design of the actuators was accomplished by simulating tasks and their required torque-speed trajectories. Tesla was looking to optimize energy, mass, and cost across 28 actuators, leveraging powertrain design learnings from vehicle development. Engineers identified the then-unobvious requirements at the actuator level through model simulations of human tasks such as turning and walking.

Task Simulation

Once they obtained the torque-speed trajectories for these tasks, they overlaid an efficiency map to calculate power consumption and cumulative energy demand over time, thereby developing a system cost.
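The overlay step can be sketched as follows; the efficiency map, torque/speed limits, and walking-cycle samples below are invented placeholders, not Tesla's data:

```python
import math

def motor_efficiency(torque_nm, speed_rad_s):
    """Toy efficiency map: peaks mid-range, falls off at the extremes.
    A real map would be measured or derived from a motor model."""
    t = min(torque_nm / 50.0, 1.0)    # normalize vs. assumed 50 Nm max
    w = min(speed_rad_s / 30.0, 1.0)  # normalize vs. assumed 30 rad/s max
    return 0.3 + 0.6 * math.sin(math.pi * t) * math.sin(math.pi * w)

def cumulative_energy(trajectory, dt=0.01):
    """Integrate electrical power over a torque-speed trajectory.
    trajectory: list of (torque_nm, speed_rad_s) samples at dt spacing."""
    energy_j = 0.0
    for torque, speed in trajectory:
        mech_power = abs(torque * speed)              # mechanical power (W)
        eff = motor_efficiency(abs(torque), abs(speed))
        energy_j += (mech_power / eff) * dt           # electrical input (J)
    return energy_j

# Hypothetical walking-cycle samples: (torque Nm, speed rad/s)
walk_cycle = [(20 * math.sin(0.02 * i), 10 * math.cos(0.02 * i))
              for i in range(314)]
print(f"Energy for one cycle: {cumulative_energy(walk_cycle):.1f} J")
```

Summing this per-task energy over a workday, then pricing the actuator that achieves it, would yield the kind of system-cost metric the presentation described.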

Actuator Cost
Efficiency Map

They simulated hundreds of actuator designs in the cloud, choosing the preferred ones.

Cloud of Actuator Cost
Optimal Joint Spec

They then parsed that cloud of designs, identifying the ideal spec for every joint.

System Cost
Optimal joint spec for all actuators

They then ran a commonality study to pare down the joint specs.
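A commonality study of this kind might look like the following greedy sketch; the joint names, torque/speed numbers, and 25% clustering threshold are all hypothetical, since the stream did not detail the method:

```python
# Toy commonality study: group per-joint (torque, speed) requirements so
# a few shared actuator designs cover them all. All numbers are invented.

joint_specs = {            # joint: (peak torque Nm, peak speed rad/s)
    "shoulder": (45, 8), "elbow": (40, 9), "hip": (110, 6),
    "knee": (120, 7), "ankle": (48, 8), "wrist": (12, 15),
}

def covers(actuator, spec):
    """An actuator covers a joint if it meets both peak requirements."""
    return actuator[0] >= spec[0] and actuator[1] >= spec[1]

# Greedy pass: repeatedly design one actuator sized to the cluster of
# joints within 25% of the most demanding uncovered torque requirement.
remaining = dict(joint_specs)
portfolio = []
while remaining:
    t_max = max(t for t, _ in remaining.values())
    cluster = {j: s for j, s in remaining.items() if s[0] >= 0.75 * t_max}
    actuator = (max(s[0] for s in cluster.values()),
                max(s[1] for s in cluster.values()))
    portfolio.append(actuator)
    remaining = {j: s for j, s in remaining.items()
                 if not covers(actuator, s)}

print(f"{len(portfolio)} shared designs cover {len(joint_specs)} joints:")
print(portfolio)
# 3 shared designs cover 6 joints: [(120, 7), (48, 9), (12, 15)]
```

The same idea, run over all 28 joints with real trajectories and cost models, is presumably how Tesla pared down to the six-actuator portfolio below.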

Commonality Study

Ultimately identifying a portfolio of six mass-manufacturable actuators.

Actuators
Actuator Portfolio

Three rotary actuators and three linear.

Actuator Configuration
Rotary and Linear Actuators

Coverage of the hands/manipulators was sparse; they are likely one of the least-developed systems.

Hands
Hands / Grippers

In terms of software, they pulled from the Tesla self-driving team for object recognition and task planning.

Computer Vision

I thought their use of computer vision to track how keypoint anchors change over time, estimating the Optimus TeslaBot's pose and trajectory for stabilization and planning, was somewhat novel.
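For illustration, estimating ego-motion from tracked keypoints can be done with a closed-form 2-D Procrustes (Kabsch-style) solve; this is a generic technique, not Tesla's actual pipeline:

```python
import math

def estimate_planar_motion(pts_prev, pts_curr):
    """Estimate planar motion (dx, dy, dtheta) from keypoints tracked
    between two frames: a least-squares rigid-transform (Procrustes) fit."""
    n = len(pts_prev)
    cx0 = sum(x for x, _ in pts_prev) / n
    cy0 = sum(y for _, y in pts_prev) / n
    cx1 = sum(x for x, _ in pts_curr) / n
    cy1 = sum(y for _, y in pts_curr) / n
    # Cross-covariance terms of the centered point sets.
    sxx = sxy = syx = syy = 0.0
    for (x0, y0), (x1, y1) in zip(pts_prev, pts_curr):
        a, b = x0 - cx0, y0 - cy0
        c, d = x1 - cx1, y1 - cy1
        sxx += a * c; sxy += a * d
        syx += b * c; syy += b * d
    theta = math.atan2(sxy - syx, sxx + syy)       # best-fit rotation
    dx = cx1 - (cx0 * math.cos(theta) - cy0 * math.sin(theta))
    dy = cy1 - (cx0 * math.sin(theta) + cy0 * math.cos(theta))
    return dx, dy, theta

# Example: keypoints rotated 0.1 rad and shifted by (1, 2)
prev = [(0, 0), (1, 0), (1, 1), (0, 1)]
c, s = math.cos(0.1), math.sin(0.1)
curr = [(c * x - s * y + 1, s * x + c * y + 2) for x, y in prev]
print(estimate_planar_motion(prev, curr))  # ≈ (1.0, 2.0, 0.1)
```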

Navigation
Visual Navigation

Walking and Planning

The locomotion planning and control starts with a representation of the Optimus TeslaBot's kinematics, dynamics, and contact properties. The planner generates reference trajectories for the entire system, working in three stages that start with planning footsteps and end with the desired path and a feasible center-of-mass trajectory. The bot walks with toe-to-heel strides using a Zero Moment Point method.
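The Zero Moment Point idea reduces to checking that the ZMP stays inside the support polygon. A one-dimensional point-mass sketch, where the foot size and CoM height are assumed values:

```python
G = 9.81  # gravity, m/s^2

def zmp_x(x_com, xdd_com, z_com):
    """Zero Moment Point (x) for a point mass at constant height:
    x_zmp = x_com - (z_com / g) * x''_com."""
    return x_com - (z_com / G) * xdd_com

def statically_stable(x_zmp, foot_min, foot_max):
    """Stable if the ZMP stays inside the support polygon (1-D here)."""
    return foot_min <= x_zmp <= foot_max

# CoM over an assumed 24 cm foot, decelerating at 1 m/s^2, CoM ~0.9 m high
z = zmp_x(x_com=0.0, xdd_com=-1.0, z_com=0.9)
print(z, statically_stable(z, -0.12, 0.12))  # ≈ 0.092 True
```

The planner's job is essentially to choose footsteps and a CoM trajectory such that this condition holds at every instant of the gait.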

Locomotion Planning

Because the motion planner uses an idealized model with simplified real-world assumptions, unmodeled real-world dynamics naturally destabilize the Optimus TeslaBot. As one would expect, they compensate via sensor feedback and state estimation, tracking the center of mass to keep the bot stable.
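A minimal sketch of that feedback loop, using a linear inverted pendulum as the simplified CoM model; the gains and constants are illustrative, not Tesla's:

```python
# Linear inverted pendulum (a common simplified CoM model) pushed off
# balance by a disturbance, then pulled back by PD feedback on the
# estimated CoM state. All numbers are illustrative assumptions.

G, Z = 9.81, 0.9            # gravity (m/s^2), assumed CoM height (m)
KP, KD = 30.0, 8.0          # hand-tuned feedback gains
DT = 0.01                   # control timestep (s)

x, xd = 0.05, 0.0           # initial 5 cm CoM offset (the "disturbance")
for _ in range(500):        # 5 s of simulation, semi-implicit Euler
    u = -KP * x - KD * xd           # feedback acceleration command
    xdd = (G / Z) * x + u           # unstable LIP dynamics + control
    xd += xdd * DT
    x += xd * DT
print(f"CoM offset after 5 s: {x:.4f} m")
```

Without the `u` term, the `(G / Z) * x` dynamics diverge exponentially; with it, the offset decays back toward zero, which is the whole point of closing the loop on state estimates.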

Motion Control
Real World Motion Control

Manipulating

They created a library of natural motion references and mapped them to the Optimus TeslaBot using inverse kinematics.
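Retargeting a captured motion reference via inverse kinematics can be illustrated with the textbook closed-form solve for a planar two-link arm; the link lengths are assumptions, and this is not Tesla's retargeting pipeline:

```python
import math

def two_link_ik(x, y, l1=0.35, l2=0.30):
    """Closed-form IK for a planar 2-link arm: map a captured hand
    position (x, y) to shoulder/elbow angles (elbow-down solution)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_elbow) > 1:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Retarget a captured hand keypoint at (0.5, 0.2) onto the arm
s, e = two_link_ik(0.5, 0.2)
print(f"shoulder={math.degrees(s):.1f} deg, elbow={math.degrees(e):.1f} deg")
```

Doing this per frame over a recorded human demonstration yields a joint-space trajectory the robot controller can track.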

Manipulating
Capturing Natural Motion References

And then generalized the natural motion references for the robot controller.

Motion Adaptation
Generalization

Creating “natural & useful” manipulation.

Manipulation

They closed by stating they are looking to get the Bumblebee TeslaBot (the most recent generation) into their Fremont facility and to make this product a reality in the coming months or years.

All in all, Optimus TeslaBot progress was greater than I expected, with Tesla's auto-manufacturing learnings seemingly more applicable than I initially believed last year. In terms of what this could actually do in a plant, one would think the current state of Optimus, while mobile, is comparable to https://www.rethinkrobotics.com/ . Software remains the key to economic impact. Then again, much like Tesla's approach to self-driving, they may see the real value materialize in the form of thousands or hundreds of thousands of units out in the real world streaming data back to base.