In a world enthralled with AI and in love with NVIDIA’s founder/CEO Jensen Huang for making the chips that make it happen, NVIDIA’s stock is gold.
At the company’s annual developer conference, GTC 2024, held earlier this year, Huang made it quite clear: he was betting all his chips on humanoid robots.
On an episode of the Acquired podcast, Huang is on top of the world, yet humble enough to give credit to OpenAI for the AI revolution. He takes credit only for positioning NVIDIA near enough to AI technology that, when it bloomed, his company, and his company alone, would have the chips ready for it.
“You want to be near where the apple falls,” says Huang. “Even if I’m not sure what and when, I position the company to be near the tree. We may have to do a diving catch for it. Even if you don’t catch the apple before it hits the ground, you want to be the first one to pick it up.”
AI was not the first apple Huang picked up. Before AI, he was equally gung-ho about autonomous vehicles. Previous GTCs featured vehicles emblazoned with NVIDIA logos, their backs filled with GPU-laden computers. A massive TuSimple tractor cab was on the show floor, and its CEO was on stage. AVs needed enormous computing power on the spot; there was no time to bounce signals off remote servers. The lag caused by accessing far-off servers could kill.
However, when an AV did kill, as one did on March 18, 2018, in Tempe, Arizona, the fallout caused Uber to abandon its AV plans. TuSimple laid off 75% of its US workforce and gave up on automating the trucking industry. The auto industry and the investor community lost interest. A world of people being driven around by their vehicles was not coming any time soon, and neither was the windfall NVIDIA may have expected.
AI may not be as intelligent as we would like it to be, but since it hasn’t killed anyone yet, the AI fever continues unabated. Because of it, NVIDIA’s revenue, profits and valuation are through the roof.
What’s Next?
Where’s the next apple going to fall? At GTC 2024, Huang made it quite clear: humanoid robots. And once again, he had the hardware for it. NVIDIA’s hardware, and only NVIDIA’s hardware, would allow humanoid robots to finally hit their stride.
Humanoid robots have been the stuff of sci-fi movies for a century. Fritz Lang’s 1927 Metropolis starred a humanoid robot. Since then, most movies have depicted robots as killers, culminating in The Terminator. Sex robots caused a brief sensation. Disney kept the dream alive with robots in its theme parks, but robots for home use were mainly novelty items. We had all resigned ourselves to evil robots in the movies and good robots on the assembly lines.
Then came GTC 2024 and Jensen Huang on the main stage… and, materializing from the darkness behind him, an entire chorus line of robots. Were they about to dance and high-kick like Dallas Cowboys cheerleaders? I prayed not. They were far too close to Huang. I could not imagine NVIDIA without Jensen.
I was, in my moment of panic, missing the point Huang was making: Humanoid robots were going to be the next big thing. And luckily for us all, the humanoid robots remained safely stationary. The only robot action was with a harmless, knee-high robot on wheels that responded to Huang’s commands about as much as my pug dog.
What Would Jensen Do?
Since Jensen Huang has shown a Midas touch with AI, he easily commands our attention.
NVIDIA’s interest in humanoid robots is driven by its vision for a future where robots can seamlessly operate in human environments. CEO Jensen Huang explains that since the world is structured around human needs—like factories designed for human workers—humanoid robots are well-suited to perform tasks in these spaces. By training robots with data based on human movements and behaviors, NVIDIA aims to create machines that can learn and adapt to complex tasks autonomously, ranging from assembly line work to complex navigation.
This aligns with NVIDIA’s broader mission to advance embodied AI and autonomous robotics, areas it believes will not only support industrial productivity but also create jobs. Through projects like Project GR00T and platforms like NVIDIA Isaac, NVIDIA is building a comprehensive toolkit for developing and training humanoid robots. This approach helps robots understand and perform human-like tasks, which Huang sees as foundational to the future of robotics and artificial intelligence. NVIDIA’s investments also bet on AI’s potential to address global challenges such as labor shortages and industrial efficiency, positioning the company as a leader in the robotics and AI sectors.
Holoscan and Machine Vision
NVIDIA Holoscan is an AI sensor-processing platform tailored for low-latency processing of sensor data. It includes optimized libraries for AI data processing and offers core microservices for managing streaming, imaging, and networking applications. Holoscan supports embedded devices, edge computing, and cloud platforms.
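To give a feel for what that looks like in practice, here is a minimal sketch of a Holoscan pipeline in Python, patterned on the SDK’s introductory video-replayer example. The operator names, port names, and the sample data path are assumptions drawn from that example, not anything specific to humanoid robots.

```python
# Minimal sketch of a Holoscan pipeline (Python), modeled on the SDK's
# video-replayer example. Paths and parameters are illustrative assumptions.
from holoscan.core import Application
from holoscan.operators import HolovizOp, VideoStreamReplayerOp


class MinimalSensorApp(Application):
    def compose(self):
        # Source: replays a pre-recorded video stream as if it were a live sensor.
        source = VideoStreamReplayerOp(
            self,
            name="replayer",
            directory="data/racerx",   # sample data shipped with the SDK (assumed path)
            basename="racerx",
            realtime=True,
        )
        # Sink: renders each frame with the Holoviz visualizer.
        sink = HolovizOp(self, name="holoviz")
        # Wire the replayer's output port into Holoviz's receivers port.
        self.add_flow(source, sink, {("output", "receivers")})


if __name__ == "__main__":
    MinimalSensorApp().run()
```

In a real sensor application, the replayer would be swapped for a camera or other capture operator and an inference operator would sit between source and sink; the pipeline structure stays the same.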
NVIDIA Jetson is an edge AI and robotics platform designed to provide high-performance computing for applications in robotics, autonomous machines and AI at the edge. It includes compact, powerful modules and a software development kit (NVIDIA JetPack).
NVIDIA offers Jetson for products that need object recognition, autonomous navigation, and complex data processing. It integrates with NVIDIA’s Holoscan, Metropolis, and Isaac, which simplify development for applications in industrial automation, smart cities, healthcare and more.
The “eyes” of humanoid robots are sensors and processors like OMNIVISION’s OG02B10 image sensor and OAX4000 processor. The sensor captures visual data, much like a human eye, enabling robots to see their surroundings in precise, high-resolution detail, even in motion. Meanwhile, the OAX4000 processor acts like the visual cortex, interpreting this data by enhancing image quality, managing multiple camera inputs, and processing it for the robot’s AI system to make decisions.
The “brain” in a humanoid robot is typically the AI and processing unit, which interprets sensor data, makes decisions and drives the robot’s actions. In robots built on NVIDIA platforms, this is often handled by processors like the NVIDIA Jetson or more advanced AI processors such as NVIDIA’s Thor and Project GR00T. These processors are designed to run complex AI algorithms and machine learning models that allow the robot to “think” and respond in real time.
NVIDIA Jetson and similar processors in humanoid robots are edge computers. Edge computing refers to processing data close to where it is generated (at the “edge” of the network), which reduces latency and enhances responsiveness. This approach is crucial for real-time applications like robotics, where immediate processing is needed for tasks like obstacle avoidance, facial recognition, and decision-making.
For instance, NVIDIA Jetson modules are built to perform AI and deep learning computations directly on the device, without relying on remote cloud servers. This capability makes them ideal for powering the “brain” of a humanoid robot, enabling it to process sensor data from machine vision components quickly and act in real time.
Jetson’s powerful GPUs and AI acceleration let the robot process visual and environmental data from its vision sensors. AI models running on Jetson (or another onboard processor) then use this information to interpret surroundings, plan movements, and interact with humans responsively.
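To make the edge-computing idea concrete, here is a small, hypothetical sketch of an on-device vision loop in Python. It assumes a Jetson-class board with a CUDA-capable GPU, a USB camera, and PyTorch/torchvision installed; the classifier is illustrative, and a production robot would more likely run an optimized engine (for example, TensorRT) and feed the results to a planner rather than print them.

```python
# Hypothetical on-device ("edge") vision loop: every frame is captured,
# classified, and acted on locally; nothing is sent to a remote server.
import cv2
import torch
from torchvision import models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Small classifier chosen for edge-friendly latency (illustrative choice).
weights = models.MobileNet_V3_Small_Weights.DEFAULT
model = models.mobilenet_v3_small(weights=weights).eval().to(device)
preprocess = weights.transforms()
labels = weights.meta["categories"]

cap = cv2.VideoCapture(0)  # local camera attached to the robot
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # OpenCV delivers BGR; convert to RGB before the torchvision transform.
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        tensor = torch.from_numpy(rgb).permute(2, 0, 1)      # HWC uint8 -> CHW
        batch = preprocess(tensor).unsqueeze(0).to(device)   # resize, normalize
        with torch.no_grad():
            probs = model(batch).softmax(dim=1)
        conf, idx = probs.max(dim=1)
        # A real robot would feed this into navigation or manipulation logic.
        print(f"{labels[idx.item()]}: {conf.item():.2f}")
finally:
    cap.release()
```

The loop runs entirely on the device, which is the whole point of the edge architecture: the latency budget is set by the GPU, not by a round trip to a data center.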
The Market for Humanoid Robots
The humanoid robot market is expected to experience rapid growth over the next decade, driven by advances in artificial intelligence, robotics, and increasing demand across industries such as healthcare, personal assistance, and retail. The global market, valued at around $1.5 billion in 2023, is projected to expand significantly, reaching up to $66 billion by 2032, with a compound annual growth rate (CAGR) ranging from 34% to 62%, depending on the market segment and region.
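As a sanity check on those headline numbers (my arithmetic, not a figure from any of the market reports), growing from roughly $1.5 billion in 2023 to $66 billion in 2032 implies a compound annual growth rate of about 52%, which sits comfortably inside the quoted 34%–62% range:

```python
# Back-of-the-envelope CAGR implied by the headline market figures above.
# Start and end values come from the article; the ~52% result is derived, not reported.
start_value = 1.5e9    # ~$1.5B estimated market size in 2023
end_value = 66e9       # ~$66B projected market size in 2032
years = 2032 - 2023    # 9-year horizon

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 52%
```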
Aging populations and the need for automation in caregiving are major growth drivers, as humanoid robots are increasingly used for companionship, healthcare assistance, and rehabilitation. In industries like retail and hospitality, robots enhance customer service through tasks like guiding, providing product information, and handling transactions. In addition, regions like Asia-Pacific, led by countries such as Japan and South Korea, are at the forefront of adopting humanoid robots, driven by strong governmental support and robust tech infrastructure.
However, challenges remain, such as high costs, security against hacking, and safety, especially as humanoid robots work alongside humans. But with mass production and technical advances, the humanoid robot market is bound to advance, as long as the robots don’t kill someone.