Elon Musk designed a more advanced prototype

Tesla Bot Optimus

Let's go through it in detail: at this year's AI Day, we got the latest updates on the Tesla bot, performance stats from the Dojo supercomputer, and simulation tests of Full Self-Driving.

AI Day 2022 took place last Friday, and like previous years, we were flooded with new technology.

The first Tesla bot, called Optimus, was the long-awaited reveal from CEO Elon Musk and his engineering team.

Elon had promised a fully self-supporting, walking, human-sized robot, and he did give us a robot. But what we saw first wasn't the actual Tesla Bot; we just saw a bot made by Tesla.

What does Tesla Bot Optimus mean?

This first prototype was named Bumble C, another Transformers reference, and Tesla built it from off-the-shelf robotic parts in less than six months.

It's nothing special as far as robots go, but Elon said it was the first time Bumble C had run without a support tether, and it worked fine.

Then they rolled out a more advanced prototype, sleeker and more streamlined, in line with the intended Tesla Bot design: far fewer wires and loose parts hanging out of this new guy.

What this second prototype wasn't yet able to do was walk out on stage on its own, but Elon said the Tesla Bot is pretty close to production with this design.

After that, things got very technical on the robotics side, with demonstrations of how the bot moves internally and the specifications and intricacies of the Tesla Bot's hand actuators. But the standout of the entire Optimus presentation was the next section.

What did the Tesla team show everyone?

The Tesla team showed everyone how Optimus sees and navigates the world. Elon and his crew briefly reminded the audience what they want Optimus to do: repetitive and dangerous tasks.

To work in and around human-occupied spaces, the Tesla bot needs a navigation system that relies on the same spatial cues we do.

Luckily, they already had the software: of course, the Tesla bot is going to use the same pure-vision Autopilot system that Tesla's cars use instead of lidar.

Unlike many contemporary robots that rely on radar or lidar, the Tesla bot has three cameras that give it a surround view: a central fisheye lens and two side-view cameras, with a combined field of view that appears to be slightly wider than 180 degrees.

Tesla's computer vision gives the bot shape recognition and edge detection so it can navigate and interact with objects in 3D space, and in the video they showed, it does a really great job.
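Tesla hasn't published its vision code, and its real pipeline is neural-network based, but as a rough illustration of what edge detection contributes, here is a minimal sketch using OpenCV's Canny detector. The file name and thresholds are placeholders, not anything Tesla has disclosed:

```python
# Minimal illustration of edge detection for navigation (NOT Tesla's code).
# Assumes OpenCV is installed: pip install opencv-python
import cv2

frame = cv2.imread("camera_frame.jpg")               # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)       # edge detection works on intensity
blurred = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress sensor noise first
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)  # binary edge map

# An edge map like this can feed a mapping/path-planning layer,
# outlining desks, planters, and walls as obstacle boundaries.
cv2.imwrite("edges.png", edges)
```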

How can a robot see the world?

At the beginning of the show, we got a very brief look at how the robot sees the world: we can see how the bot recognizes and color-codes the objects it identifies, like that watering can.

Obstacles like the desk, planter, and walls are plain white, while the floor is colored purple, showing that the robot is aware of its surroundings and knows what it is looking at.

Afterwards, they showed video of the bot's camera input on top of a rendering of the 3D vector space the computer uses to view the world, just like in the cars: camera video is instantly converted into a three-dimensional digital space with a bird's-eye view.
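Tesla's cars and the bot derive this top-down view with learned neural networks, but the classic way to build the same intuition is inverse perspective mapping: warping ground-plane pixels into a bird's-eye view with a homography. Here's a minimal sketch, where the corner correspondences are made-up assumptions:

```python
# Minimal sketch of inverse perspective mapping: warp a forward-facing
# camera frame into a bird's-eye view. NOT Tesla's method (they use NNs).
import cv2
import numpy as np

frame = cv2.imread("camera_frame.jpg")  # hypothetical input frame

# Four points on the ground plane as seen in the image (assumed trapezoid)...
src = np.float32([[300, 400], [500, 400], [700, 600], [100, 600]])
# ...and where they should land in a top-down view (rectangle).
dst = np.float32([[200, 0], [600, 0], [600, 800], [200, 800]])

H = cv2.getPerspectiveTransform(src, dst)        # 3x3 homography
bev = cv2.warpPerspective(frame, H, (800, 800))  # bird's-eye view image
cv2.imwrite("birdseye.png", bev)
```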

Here we can see the bot marking the clear path with green pixels and obstacles in white, and we can see how it shades objects by distance: things that are closer are rendered darker, while things that are farther away fade into lighter shadow.
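We don't know how Tesla renders that shading, but the idea is simple to sketch: map each pixel's estimated distance to an intensity so that near means dark and far means light. The depth values and range below are made up for illustration:

```python
# Minimal sketch: shade a depth map so closer pixels render darker
# (assumed convention from the on-stage visualization; not Tesla's code).
import numpy as np

def shade_by_distance(depth_m: np.ndarray, max_range_m: float = 10.0) -> np.ndarray:
    """Map per-pixel depth in meters to an 8-bit intensity: near=dark, far=light."""
    norm = np.clip(depth_m / max_range_m, 0.0, 1.0)  # 0 = at camera, 1 = max range
    return (norm * 255).astype(np.uint8)

# Hypothetical 4x4 depth map in meters
depth = np.array([[0.5, 1.0, 2.0, 9.5]] * 4)
print(shade_by_distance(depth))  # near pixels ~12 (dark), far pixels ~242 (light)
```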

They also showed us the bot's navigation layer, which uses edge detection to map the room and identify a clear path via key points that could otherwise have been lost in all the glare.

What is Elon’s idea for this robot?

The technical demonstrations covered the basic concepts and ideas behind building and scaling the Tesla bot from this first design platform. Elon's idea is that this robot is supposed to be a home assistant.

So making it too expensive will not work. Optimus isn't meant to be a technical showpiece like the robots from Boston Dynamics or Honda; it's meant to be mass-produced.

Just as Tesla was the first company to mass-produce electric cars, they want to be the first mass-market droid maker as well, and the presentation walked us through the design process.

A first prototype in six months, then a second prototype six months later, designed for cost and manufacturing efficiency like Tesla's cars. This is really the sticking point, as it defines the whole Optimus showcase.

Tesla's engineers needed to design a working robot with an emphasis on affordability and utility, and they had to do it faster than other companies, many of which had a head start of years.

How does Dojo stack up against Tesla's current GPUs?

So they looked at their cars: they were already building robots on four wheels, and they found a way to put one on two legs.

Swapping four wheels for legs has given Tesla a humanoid robot that can move under its own power and precisely perceive its surroundings, all in about a year.

It won't be doing your dishes just yet, though. Tesla's Dojo supercomputer has been in development since last year's AI Day, and now we have a better understanding of how it stacks up against Tesla's current GPU compute.

When it was unveiled last year, Tesla's Dojo project was envisioned as an AI training tool: basically, a computer that runs simulations based on real-world data.

The AI uses those simulations to train the software that will let Tesla vehicles actually drive themselves. For now, Tesla is still running that task on a vast GPU farm.

Rack after rack of Nvidia graphics cards struggles to process all the data and train the AI, but Nvidia GPUs aren't really built for this specific task.

Key points:
  • GPUs are designed for a wide variety of workloads, and therefore have bottlenecks that slow down AI training.
  • For starters, communication slows everything down: each card needs to talk to its own memory.
  • Then cards need to pass data between each other in the same rack, and the racks need to talk to each other.
  • That is a lot of data traffic, and while the current technology works, it works very slowly (see the sketch after this list).
  • Tesla says it currently takes about a month to train the AI for any specific driving scenario.
  • That means a long wait before results can be adjusted and tested again.
  • A month is just too long, so Tesla's engineers started designing a purpose-built supercomputer with two goals in mind: density and scalability.
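To make that communication bottleneck concrete, here is a rough back-of-envelope sketch in Python of how much gradient traffic a data-parallel GPU farm has to move on every training step. All the numbers (model size, GPU count, link bandwidth) are illustrative assumptions, not Tesla's figures:

```python
# Back-of-envelope sketch of the communication cost in data-parallel
# GPU training. Every number below is an illustrative assumption.

params = 1e9            # model size: 1 billion parameters (assumed)
bytes_per_grad = 2      # fp16 gradients
gpus = 256              # GPUs in the farm (assumed)
link_gbps = 50          # effective interconnect bandwidth per GPU, Gbit/s (assumed)

# Ring all-reduce moves roughly 2 * (N-1)/N * payload per GPU per step.
payload_bytes = params * bytes_per_grad
traffic_bytes = 2 * (gpus - 1) / gpus * payload_bytes
seconds_per_sync = traffic_bytes * 8 / (link_gbps * 1e9)

print(f"~{traffic_bytes / 1e9:.1f} GB moved per GPU per step "
      f"=> ~{seconds_per_sync:.2f} s per gradient sync")
# Every single training step pays this tax, which is why a fabric built
# for density and fast interconnect (Dojo's stated goals) matters.
```

With these made-up numbers, every step spends well over half a second just synchronizing gradients, before any actual computation; multiply that by millions of steps and the month-long training runs start to make sense.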
