We're here at Nvidia GTC.
Where the company is showing off a suite of technologies that are designed to help robots learn how to be robots.
Let's get into it.
Let's.
Let's talk about robots.
One often repeated phrase at this year's Nvidia GTC was physical AI.
The kind of AI that allows robots to operate effectively in the physical world.
To help get there, Nvidia is offering its Isaac Groot N1.
An open source foundation model that is pre-trained with some of the basics and is broken down into a fast system for taking action and a slow system for planning action.
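To make that fast/slow split concrete, here is a minimal Python sketch of a dual-system control loop: a slow planner that refreshes a high-level plan occasionally, and a fast actor that turns the current plan plus fresh observations into low-level actions on every tick. All function names and data shapes here are hypothetical stand-ins for illustration, not the actual GR00T N1 interfaces.

```python
# Hypothetical sketch of a dual-system controller: a slow "planner" that
# periodically produces a high-level plan, and a fast "actor" that turns the
# current plan plus fresh sensor data into low-level actions every cycle.
# Names and data structures are illustrative, not the real GR00T N1 API.

import time


def slow_planner(observation, goal):
    """Deliberate planning step; expensive, run infrequently."""
    # In a real system this might be a large vision-language model call.
    return {"goal": goal, "waypoints": ["approach", "grasp", "place"]}


def fast_actor(plan, observation):
    """Reactive policy step; cheap, run every control cycle."""
    # In a real system this might be a small learned policy network.
    return {"joint_targets": [0.0] * 7, "gripper": "open"}


def control_loop(goal, steps=100, replan_every=20):
    plan = None
    for step in range(steps):
        observation = {"step": step}               # stand-in for camera/proprioception
        if plan is None or step % replan_every == 0:
            plan = slow_planner(observation, goal)  # slow path: plan occasionally
        action = fast_actor(plan, observation)      # fast path: act every tick
        # send `action` to the robot here
        time.sleep(0.01)                            # ~100 Hz fast loop


control_loop(goal="put the cup in the bin")
```

Splitting the loop this way lets the expensive reasoning step run at a few hertz while the reactive policy keeps the robot responsive at a much higher rate.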
The next step is teaching the robot new tasks, something that requires a large and diverse data set, especially if the robot will be required to do the learned task in a variety of different conditions.
To learn how Nvidia is helping robots get there, we spoke with Akhil Docca, senior product manager for Nvidia's Omniverse.
What you're seeing here is essentially the policy training where it's actually learning how to drop something in there.
This was actually trained using teleop, so this particular version of what you see was actually done via teleop, where somebody donned a headset, we did some human demonstration, and we scaled that using synthetic data.
The gathering of that synthetic data can be achieved through Nvidia's Omniverse with Cosmos.
And Akhil helped me understand what each of those things has to offer.
Omniverse is a platform that brings together data from different sources and allows you to build something like a digital twin, which is physically accurate.
The data that comes out from Omniverse can go through Cosmos.
Cosmos is really important because it allows not only the augmentation of data for photorealism to reduce the simulation-to-real gap, but it also allows you to create an exponentially larger data set.
So I can go from maybe a few thousand images to a million images and more, and really that data diversity is important because it helps you train the model in a better way and helps it generalize.
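As a rough illustration of that scaling step, here is a small Python sketch that expands a handful of recorded demonstrations into many training samples by applying randomized variations (lighting, camera angle, object position). The sample fields are made up for the example and it does not use the Omniverse or Cosmos APIs; it only shows the domain-randomization idea behind going from a few thousand images to a million.

```python
# Hedged sketch of the "few demos -> many training samples" idea described
# above: take a handful of recorded demonstrations and expand them by applying
# randomized perturbations. This is only an illustration of domain
# randomization; it does not use Omniverse or Cosmos.

import random


def randomize(sample, rng):
    """Return a perturbed copy of one demonstration frame."""
    return {
        "image_brightness": sample["image_brightness"] * rng.uniform(0.7, 1.3),
        "camera_yaw_deg": sample["camera_yaw_deg"] + rng.uniform(-5, 5),
        "object_x": sample["object_x"] + rng.uniform(-0.02, 0.02),
        "action": sample["action"],  # the demonstrated action stays as the label
    }


def expand_dataset(demos, copies_per_demo, seed=0):
    rng = random.Random(seed)
    return [randomize(d, rng) for d in demos for _ in range(copies_per_demo)]


demos = [{"image_brightness": 1.0, "camera_yaw_deg": 0.0,
          "object_x": 0.5, "action": "grasp"}]
synthetic = expand_dataset(demos, copies_per_demo=1000)
print(len(synthetic))  # a single demo becomes 1,000 varied training samples
```

The diversity injected by those random variations is what pushes the trained policy toward generalizing beyond the exact conditions of the original demonstrations.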
Going from training around specific tasks to training that results in a more general set of capabilities or behaviors is one big challenge that the whole humanoid robotics industry seems to be working on right now.
Making that synthetic data as applicable to the real world as possible requires a detailed representation of the laws of physics.
Today, we're announcing something really, really special. It is a partnership of three companies, DeepMind, Disney Research and Nvidia, and we call it Newton.
Huang brought out a remote-controlled BDX droid to celebrate the new physics engine and announced that, like Isaac Groot N1, Newton would also be open source.
That's why Newton is open source, because we want to be able to not only give the developers this amazing, you know, work that we're doing with Google DeepMind and Disney, but also they can contribute back to it.
The backdrop of much of this work on humanoid robots is a looming anxiety about labor shortages.
We know very clearly that the world has a severe shortage of human laborers, human workers. By the end of this decade, the world is going to be at least 50 million workers short.
To get more information, I reached out to Nvidia, who told me the predicted shortage of 50 million workers by the end of the decade was based on a combination of estimates from various sources across various industries and locations.
A representative from Nvidia also clarified the cost model, saying quote, it's more like subscribing to full self-driving services for an autonomous vehicle.
The customer buys the car and then would pay for the services it wants on top.
For more robot videos, you can check out this playlist right here and subscribe to CNET for the latest.