Optimus Gen 2

On December 12th, Tesla’s Optimus account on X.com dropped a holiday surprise on us in the form of a video showcasing the abilities of the next generation of the company’s bipedal robot.

The video shows a brief progression of the Tesla Bot from its debut as “Bumblebee” back in 2022, to the current version - which boasts a 30% walking speed boost over the Gen 1 version from March of this year, a 10 kilogram weight reduction, and some impressive new updates to the neck, hands, and the now-human-shaped feet.

The weight reduction and the walking speed boost are impressive for such a short development time since the March Gen 1 - the extra articulation in the toes lends the bot a much steadier walking pattern - but the truly impressive leaps in technology come from the neck and hands.

And we’re speaking mostly about the actuators here.

Tesla has made a big deal this year about moving the manufacturing of their actuators in-house - in fact, back in March they said that the reason Optimus Gen 1 was operating so well compared to the 2022 debut model “Bumblebee” was that the team had decided to produce these delicate parts themselves rather than using off-the-shelf components.

Actuators are the components that convert control signals into the actual force required to move everything from the Tesla Bot’s legs to the fine motions of its fingers - and this is where Tesla gets to show off.

In the brief moments when we see the robot’s neck and fingers move for the first time, the subtitles tell us that the neck now has a 2-DoF actuated joint, and the hands have a whopping 11 DoF.

DoF here stands for “Degrees of Freedom”, which in mechanics refers to the number of independent axes along which an object can move. In the case of the new neck joint, this means the head can now yaw left and right, as well as pitch forwards and backwards.
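One way to picture what those DoF counts mean is that each degree of freedom is simply one independently actuated axis. A minimal sketch - where the joint names and limit values are entirely hypothetical, and only the 2-DoF neck figure comes from the video’s subtitles:

```python
# Illustrative sketch of "degrees of freedom": each DoF is one
# independently actuated axis. The joint names and limit values
# below are hypothetical; only the DoF count (2 for the neck)
# comes from the video's subtitles.

NECK_JOINTS = {
    "yaw":   (-90.0, 90.0),   # turn head left/right, in degrees
    "pitch": (-45.0, 45.0),   # nod head forwards/backwards
}

def clamp_pose(pose, joints):
    """Clamp each commanded angle to its joint's limits."""
    return {
        name: max(lo, min(hi, pose.get(name, 0.0)))
        for name, (lo, hi) in joints.items()
    }

print(len(NECK_JOINTS))  # 2 degrees of freedom
print(clamp_pose({"yaw": 120.0, "pitch": -10.0}, NECK_JOINTS))
# the out-of-range yaw command is clamped to 90.0
```

The same idea scales up: an 11-DoF hand is just eleven of these independently commanded axes, which is what makes such fine finger motion possible.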

Which means that those fingers are VERY articulated.

In fact, the motion of those fingers is now so smooth that Optimus developers felt they had to jump into the replies and confirm that the video wasn’t CGI and hadn’t been sped up.

The really mind-blowing tech is, however, not even the actuators. It’s whatever Tesla is now using to give their robot some tactile sense in its fingers.

During the last few seconds of the video, we’re shown Optimus handling some eggs, with a visualization of the sensors in its fingers. Now, robots handle eggs all the time - but they usually do so with little tricks. They’re either programmed with the precise amount of force needed to pick up the egg without cracking it, or they’re designed to use soft grippers that use suction to hold the delicate object.

But Tesla went the more human route, and seems to have developed some sort of pressure sensor that allows Optimus to “feel” whatever it’s holding and gauge the correct amount of force to apply with those delicate new actuators.

This is way closer to how humans actually interact with objects - the nerves in our fingers tell us things like the shape, texture, and give of a surface, and send that information to our brains, which then decide how much force to use when manipulating it.
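That sense-then-adjust cycle is a classic feedback loop, and it can be sketched in a few lines. To be clear, Tesla hasn’t published how Optimus actually does this - the sensor model, gain, and target pressure below are all made up for illustration:

```python
# A minimal sketch of closed-loop grip control: read a fingertip
# pressure sensor, compare against a target, and nudge the grip
# force up or down. All values here are hypothetical; Tesla has
# not disclosed how Optimus's tactile sensing actually works.

TARGET_PRESSURE = 0.30   # hypothetical normalized fingertip pressure
GAIN = 0.5               # proportional gain

def adjust_grip(current_force, sensed_pressure):
    """One control step: squeeze harder if the grip is too light,
    ease off if pressing too hard (e.g. about to crack an egg)."""
    error = TARGET_PRESSURE - sensed_pressure
    return max(0.0, current_force + GAIN * error)

# Simulate a few steps against a toy "object" whose sensed
# pressure simply tracks the applied force:
force = 0.0
for _ in range(20):
    sensed = force * 0.9          # toy sensor model
    force = adjust_grip(force, sensed)
print(round(force, 3))            # settles near the target pressure
```

The contrast with the "little tricks" above is the point: a preprogrammed gripper applies one fixed force no matter what, while a loop like this keeps correcting based on what the fingertips actually feel.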

And this is exactly how we’ve been told Optimus works. Back in March, when Gen 1 was showcased, Tesla’s presenters made a big deal about their robot’s learning algorithms - and the September video of Optimus sorting blocks was another example of it learning by doing, having only been told to sort the blocks by colour, not how.

We don’t know any details about this tactile sensing technology yet - and we may never, as it could well be proprietary - but it could also be used in the robot’s feet to help it gauge the surfaces it’s walking on - just like we do.

This is a gigantic technological leap from earlier this year, and CEO Elon Musk says he believes Optimus will be able to thread a needle within a year. That might seem overly ambitious, but it’s hard to overstate how big a deal giving Optimus tactile senses is. If the team can refine its finger controls a bit more, we could well see a sewing demonstration in a couple of months.
