Tesla unveiled their working Optimus TeslaBot prototype about 17 minutes and 26 seconds into Friday evening's live stream. Shortly after, Elon reiterated the importance of labor in our economy, a concept he believes few truly understand.
Elon further commented on the importance of Tesla being a public company, noting that, as such, the public can shape its future if we band together and vote in sync.
They have made progress: a working prototype within six months of last year's announcement.
They expect to put the latest generation Bumblebee TeslaBot on the plant floor within a matter of months.
The live stream hit on the hard problems of bipedal, humanoid robotics, detailing their approach to developing requirements by analyzing human performance.
Optimus TeslaBot – Battery and Processors
They have pulled battery technology from existing products, centralizing power distribution and computing at the physical center of the platform. Power management is handled on a single PCB covering sensing, fusing, charge management, and distribution of the 2.3 kWh pack, which should be sufficient for a full day's work.
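To sanity-check that claim, here's a rough back-of-envelope in Python; the shift length and duty-cycle power figures are my own assumptions, not numbers from the stream.

```python
# Rough energy-budget sanity check on the 2.3 kWh pack. Shift length and
# per-activity power draws are illustrative assumptions, not stream figures.
PACK_KWH = 2.3
SHIFT_HOURS = 8.0                     # assumed one full work shift

# assumed average power draw by activity (watts) and share of the shift
duty_cycle = {
    "idle":    (100.0, 0.30),
    "walking": (500.0, 0.20),
    "working": (300.0, 0.50),
}

avg_watts = sum(watts * share for watts, share in duty_cycle.values())
energy_kwh = avg_watts * SHIFT_HOURS / 1000.0

print(f"average draw:       {avg_watts:.0f} W")
print(f"energy over shift:  {energy_kwh:.2f} kWh (pack: {PACK_KWH} kWh)")
print(f"headroom:           {PACK_KWH - energy_kwh:+.2f} kWh")
```

Under those assumed numbers, roughly 280 W of average draw means 2.3 kWh does pencil out to about an eight-hour shift.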
The central computer in the bot is a modified Tesla Full Self-Driving computer.
Optimus TeslaBot – Actuators
The design of the actuators was driven by simulating tasks and their required torque-speed trajectories. Tesla was looking to optimize energy, mass, and cost across 28 actuators, leveraging powertrain design learnings from vehicle development. Engineers began by identifying the then-unobvious requirements at the actuator level through model simulations of human tasks such as turning and walking.
Once they obtained the torque-speed trajectories for these tasks, they overlaid an efficiency map, calculating power consumption and cumulative energy demands over time and thereby developing a system cost.
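As a rough illustration of that step, the sketch below integrates electrical power over a simulated torque-speed trajectory using a toy efficiency map; Tesla's actual maps, trajectories, and cost model are not public.

```python
import numpy as np

# Toy trajectory -> energy step: a torque-speed trajectory sampled at fixed dt,
# run through a made-up efficiency map to get power and cumulative energy.

dt = 0.01  # s, sample period of the simulated task

# torque (Nm) and speed (rad/s) demanded at one joint during a cyclic task
phase = np.linspace(0, 4 * np.pi, 1000)
torque = np.abs(20 * np.sin(phase))
speed  = np.abs(3 * np.cos(phase))

def efficiency(tau, omega):
    """Toy efficiency map: efficiency falls off at low torque/speed."""
    mech = tau * omega
    return np.clip(0.9 * mech / (mech + 5.0), 0.05, 0.9)

mech_power = torque * speed                       # W at the joint
elec_power = mech_power / efficiency(torque, speed)
energy_J   = np.sum(elec_power) * dt              # cumulative energy demand

print(f"peak electrical power: {elec_power.max():.0f} W")
print(f"energy for this task:  {energy_J / 3600:.3f} Wh")
```

Repeating this evaluation across candidate motor designs gives the energy term of the energy/mass/cost trade described on stream.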
They simulated hundreds of actuator designs in the cloud, parsing the results to identify the ideal spec for each joint, then ran a commonality study to pare those specs down, ultimately identifying a portfolio of six mass-manufacturable actuators: three rotary and three linear.
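The commonality study itself is essentially a coverage problem: find a small set of actuator designs that satisfies every joint's requirements. Here's a toy version with made-up joint requirements and candidate designs, using a simple greedy heuristic rather than whatever optimization Tesla actually ran.

```python
# Toy commonality study: pick the fewest actuator designs that cover every
# joint's peak torque/speed requirement. All numbers are made up; the real
# study also weighed mass and cost.

joints = {            # joint: (peak torque Nm, peak speed rad/s) required
    "hip":      (180, 6),
    "knee":     (200, 8),
    "ankle":    (120, 7),
    "shoulder": (60, 10),
    "elbow":    (40, 12),
    "wrist":    (15, 15),
}

candidates = {        # actuator design: (torque capacity, speed capacity)
    "rotary_L": (210, 9),
    "rotary_M": (70, 12),
    "rotary_S": (20, 16),
}

def covers(actuator, joint):
    (tq_cap, sp_cap), (tq_req, sp_req) = candidates[actuator], joints[joint]
    return tq_cap >= tq_req and sp_cap >= sp_req

portfolio, uncovered = [], set(joints)
while uncovered:
    # greedy: take the design that covers the most remaining joints
    best = max(candidates, key=lambda a: sum(covers(a, j) for j in uncovered))
    newly = {j for j in uncovered if covers(best, j)}
    if not newly:
        raise ValueError(f"no candidate covers: {uncovered}")
    portfolio.append(best)
    uncovered -= newly

print("portfolio:", portfolio)
```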
Coverage of the hand/manipulators was sparse, likely because it is one of the least developed systems.
In terms of software, they pulled from the Tesla self-driving team for object recognition and task planning.
I thought their use of computer vision to identify keypoints and track how they change over time, estimating the Optimus TeslaBot's pose and trajectory for stabilization and planning, was somewhat novel.
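The stream didn't go deep on the vision stack, but the general idea of estimating how the bot has moved by fitting a rigid transform to tracked keypoints between frames looks something like this. It's a generic Kabsch least-squares fit on synthetic points, not Tesla's implementation.

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rotation R and translation t with Q ≈ R @ P + t (Kabsch)."""
    p_bar, q_bar = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - p_bar) @ (Q - q_bar).T
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflections
    R = Vt.T @ D @ U.T
    t = q_bar - R @ p_bar
    return R, t

# Matched keypoints (3 x N) seen in two consecutive frames; in a real system
# these would come from the vision stack's feature tracker.
rng = np.random.default_rng(0)
pts_prev = rng.uniform(-1, 1, size=(3, 8))
true_R = np.array([[0.9950, -0.0998, 0.0],
                   [0.0998,  0.9950, 0.0],
                   [0.0,     0.0,    1.0]])        # ~5.7 degree yaw per frame
true_t = np.array([[0.02], [0.00], [0.01]])
pts_curr = true_R @ pts_prev + true_t + rng.normal(0, 1e-3, size=(3, 8))

R_est, t_est = rigid_fit(pts_prev, pts_curr)
print("estimated translation per frame:", t_est.ravel())
```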
Walking and Planning
The locomotion planning and control starts with a representation of the Optimus TeslaBot's kinematics, dynamics, and contact properties. The planner generates reference trajectories for the entire system, working in three stages: it begins by planning footsteps and ends with the desired path and a feasible center-of-mass trajectory. The bot uses toe-to-heel strides with a Zero Moment Point method.
Because the motion planner uses an idealized model with simplified assumptions, unmodeled real-world dynamics naturally destabilize the Optimus TeslaBot. As one would expect, they correct for this with sensor feedback and state estimation, tracking the center of mass to keep the bot stable.
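For context on the ZMP approach, the sketch below shows the textbook cart-table relation between the planned center-of-mass trajectory and the ZMP, plus a toy PD correction driven by the estimated CoM error. The CoM height, sway amplitude, and foot width are illustrative assumptions, and none of this is Tesla's actual planner or controller.

```python
import numpy as np

# Cart-table / linear-inverted-pendulum relation used by ZMP-style planners:
#   zmp = com - (z_c / g) * com_ddot

g, z_c, dt = 9.81, 0.9, 0.01                  # gravity, assumed CoM height, step
t = np.arange(0.0, 2.0, dt)

com_ref = 0.03 * np.sin(2 * np.pi * 0.5 * t)  # planned lateral CoM sway (m)
com_acc = np.gradient(np.gradient(com_ref, dt), dt)
zmp_ref = com_ref - (z_c / g) * com_acc

# Feasibility: the ZMP must stay inside the support polygon (+/- 7 cm assumed).
assert np.all(np.abs(zmp_ref) < 0.07), "planned ZMP leaves the support polygon"

# Toy stabilization: PD feedback on the CoM error reported by state estimation,
# commanding a corrective CoM acceleration when unmodeled dynamics cause drift.
kp, kd = 50.0, 10.0
com_meas, com_vel_meas = com_ref[100] + 0.01, 0.0     # pretend 1 cm of drift
acc_corr = kp * (com_ref[100] - com_meas) - kd * com_vel_meas
print(f"corrective CoM acceleration: {acc_corr:.2f} m/s^2")
```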
Manipulating
They created a library of natural motion references and mapped them onto the Optimus TeslaBot using inverse kinematics, then generalized those references for the robot controller, producing “natural & useful” manipulation.
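The stream didn't detail the solver, but the mapping step amounts to running a recorded end-effector path through inverse kinematics to get joint trajectories. A minimal sketch with a stand-in two-link planar arm and a made-up reference path:

```python
import numpy as np

# Map a recorded "motion reference" (an end-effector path) onto joint angles
# with inverse kinematics. A 2-link planar arm stands in for the real arm;
# link lengths and the path are made up for illustration.

L1, L2 = 0.35, 0.30                              # link lengths (m), assumed

def ik_2link(x, y, elbow_up=True):
    """Closed-form IK for a planar 2-link arm reaching point (x, y)."""
    c2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
    c2 = np.clip(c2, -1.0, 1.0)                  # guard unreachable points
    q2 = np.arccos(c2) * (1 if elbow_up else -1)
    q1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(q2), L1 + L2 * np.cos(q2))
    return q1, q2

# A recorded reference: sweep the hand along a short horizontal line.
xs = np.linspace(0.30, 0.50, 50)
ys = np.full_like(xs, 0.15)
joint_traj = np.array([ik_2link(x, y) for x, y in zip(xs, ys)])

print("first joint angles (rad):", np.round(joint_traj[0], 3))
print("last  joint angles (rad):", np.round(joint_traj[-1], 3))
```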
They closed by stating they are looking to get the Bumblebee TeslaBot (the most recent generation) into their Fremont facility and to make this product a reality in the coming months or years.
All in all, Optimus TeslaBot progress was greater than I expected, with Tesla's auto-manufacturing learnings seemingly more applicable than I believed last year. In terms of what this could actually do in a plant, one would think the current state of Optimus, while mobile, is comparable to Rethink Robotics (https://www.rethinkrobotics.com/). Software remains the key to economic impact. Then again, much like Tesla's approach to self-driving, they may see the real value materialize in the form of thousands or hundreds of thousands of units out in the real world streaming data back to base.