NVIDIA chief executive officer (CEO) Jensen Huang on Friday (Mar 1) said artificial general intelligence could – by some definitions – arrive in as little as five years.
Huang, who heads the world’s leading maker of artificial intelligence (AI) chips used to create systems like OpenAI’s ChatGPT, was responding to a question at an economic forum at Stanford University about how long it would take to achieve one of Silicon Valley’s long-held goals: creating computers that can think like humans.
Huang said the answer largely depends on how the goal is defined. If the definition is the ability to pass human tests, Huang said, artificial general intelligence (AGI) will arrive soon.
“If I gave an AI… every single test that you can possibly imagine, you make that list of tests and put it in front of the computer science industry, and I’m guessing in five years’ time, we’ll do well on every single one,” said Huang, whose firm hit US$2 trillion in market value on Friday.
As of now, AI can pass tests such as legal bar exams, but it still struggles with specialised medical tests such as those in gastroenterology. Huang said that within five years, it should be able to pass any of those as well.
But by other definitions, Huang said, AGI may be much further away, because scientists still disagree on how to describe how human minds work.
“Therefore, it’s hard to achieve as an engineer” because engineers need defined goals, Huang said.
Huang also addressed a question about how many more chip factories, called “fabs” in the industry, are needed to support the expansion of the AI industry. Media reports said OpenAI CEO Sam Altman thinks many more fabs are needed.
Huang said more fabs will be needed, but noted that each chip will also improve over time, limiting the total number required.
“We’re going to need more fabs. However, remember that we’re also improving the algorithms and the processing of (AI) tremendously over time,” Huang said. “It’s not as if the efficiency of computing is what it is today, and therefore the demand is this much. I’m improving computing by a million times over 10 years.” REUTERS