I’m in San Jose’s McEnery Convention Center, once again home for NVIDIA’s GTC conference. It’s good to be back. GTC took a few years off, as did we all, during COVID. Now it is back with a vengeance, drawing 25,000 attendees. Many are boasting about their involvement with AI, and quite a few are wearing it on their person.
AI is, of course, the fashion. The area is well known for its computer revolutions, beginning with Hewlett-Packard's start in the now-famous garage. Since then, revolutions have been financed by venture capitalists. To get here I drove past Sand Hill Road, home to many of those VCs, from my home in Marin County, north of all the action and largely ignored – unless you count Autodesk.
NVIDIA, however, is the bullseye of the current AI revolution. AI is the word on everyone’s lips. No conversation in these parts goes more than a minute without mentioning AI. If the billboards on 101 from SFO don’t let you know, the GTC conference will definitely remind you.
If I were Asian, in the Bay Area and not in AI, I would have serious FOMO*. Jensen Huang may be their patron saint. Of Asian heritage, the leather-jacketed Jensen is really everybody's favorite right now. He is the brightest star in a sky full of stars. The next brightest: Sam Altman, co-founder and CEO of OpenAI, and the George Washington of the new AI nation, thanks to ChatGPT. Altman is not here, as far as I can tell.
Jensen practically owns the hardware side of AI. All software, including AI, needs hardware to run on, GPUs in particular, and here is NVIDIA to supply it. AMD makes GPUs as well and is trying very hard to get the world to notice.
NVIDIA’s success is somewhat of a happy accident. NVIDIA was not initially about AI. It was about gaming. Jensen saved NVIDIA from the brink of bankruptcy by creating a chip that made gaming better. You heard that right. NVIDIA’s GPUs made games like Grand Theft Auto more enjoyable to play. He may have single-handedly decided that gamers were suffering from lousy graphics and set about making chips, and graphics cards built around them, so games would run better. It was a niche market.
Take the runaway hit that was Pac-Man. What could have had worse graphics? Nevertheless, that pixelated object became a phenomenon. There was Space Invaders before that. Horrible graphics, no question. Point is: Even though gamers were happy with Pac-Man, it took someone like Jensen to say: Hey, look. You could do a whole lot better. Let me give you a chip that will change your world.
Then came the GPU, the graphics processing unit. Unlike the CPU, it does its work in parallel. The CPU (central processing unit), the living heart of most computers, works in series, attending to one operation after another.
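The distinction can be sketched in a few lines. This toy Python snippet (my illustration, nothing from NVIDIA) contrasts a serial, CPU-style loop with a data-parallel, GPU-style operation; on real hardware the parallel form would run across thousands of GPU cores at once.

```python
# Toy sketch of serial versus parallel computation.
# Plain Python lists stand in for the data; real GPU code would use CUDA.

a = list(range(8))
b = list(range(8))

# Serial, CPU-style: attend to one element after the other.
serial = []
for x, y in zip(a, b):
    serial.append(x + y)

# Data-parallel, GPU-style: conceptually, the same add applied to every
# element pair at once (only simulated here, on a single core).
parallel = [x + y for x, y in zip(a, b)]

print(serial == parallel)  # both styles compute the same result
```

The point is not speed on eight numbers; it is that the parallel formulation scales to millions of elements when the hardware can actually do the work simultaneously.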
Though it came late to serious computing, the GPU went on to steal the show. Long after the business world embraced Windows, Word, and Excel came AI. The world changed almost overnight. The CPU quickly became yesterday’s news. The GPU was all everybody wanted. The chip created for gaming turned out to be surprisingly good at AI, another application that can take advantage of parallel processing.
Huang admits to being fortunate to have had the right chip at the right time.
“You have to be where the apple is falling so you can catch it,” he has been known to say.
So when AI became the next big thing, NVIDIA had more GPUs than everyone else combined. And now we have NVIDIA, a chip manufacturer, practically the face of AI.
—————
*Fear of missing out