Intel Looks to a New Chip to Power the Coming Age of AI

The world's largest chip maker is building a chip just for machine learning as the biggest tech companies look to an AI-powered future.

Microsoft researchers recently built an artificially intelligent system that seems to recognize conversational speech as effectively as a human. Yes, this research comes with caveats, but it's part of a very real and very rapid leap in artificial intelligence over the past several years, a leap driven by deep neural networks.

These sweepingly complex algorithms can teach themselves very particular tasks by analyzing vast amounts of data. Microsoft's system learned to recognize words by looking for patterns in old tech support calls. But it's not just the algorithms that are driving the recent revolution in AI. It's also the hardware behind these algorithms. Microsoft's speech recognition system relies on large farms of GPU processors, chips that were originally designed for rendering graphics but have proven remarkably adept at running artificial intelligence models.

Internet giants like Microsoft, Google, Facebook, and Baidu typically train their deep neural nets using GPUs. But they're moving towards other, more specialized chips that can help accelerate not just the training but the execution of these systems. Google recently built its own AI processor. IBM is building another.

So Intel, the world's largest chip maker, is doing the same. Yesterday, the company unveiled a new AI processor called Nervana, saying it plans to test prototypes by the middle of next year; if all goes well, a finished chip will reach the market by the end of 2017. At the moment, the market for AI chips is dominated by Nvidia, the primary supplier of GPUs. But Intel is pushing to be a big part of this potentially enormous market in the years to come.

The new chip is based on technology originally built by a startup Intel acquired earlier this year, also called Nervana. "This really is the industry's first purpose-built silicon for artificial intelligence," says Intel corporate vice president Jason Waxman, who oversees the chips the company sells for big data centers like those operated by Google and Microsoft, explaining why the chip maker acquired the startup. That claim leaves out Google's TPU chip, IBM's TrueNorth, and perhaps others. But the Google chip isn't a commercial product---it's only used inside Google data centers---and the IBM chip has yet to reach the market, with some top AI researchers questioning whether it's suited to deep learning. In any event, the race is on to create the chips that will power an AI-dominated future.

Two Stages

Deep neural networks operate in two stages. First comes the training stage: a company like Microsoft feeds a neural net data that will enable it to perform a particular task, like speech recognition. Then comes the execution stage, when people actually use the neural net to, say, recognize commands spoken into smartphones. Intel's Nervana chip is designed to help with both stages, says Nervana founder Naveen Rao, now an Intel vice president and general manager.
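The split between those two stages can be sketched in a few lines of code. This is a deliberately tiny illustration---a single artificial neuron fit by gradient descent on made-up data, not anything resembling Microsoft's or Intel's actual systems---but the shape is the same: an expensive training loop produces parameters once, and a cheap execution function applies them over and over.

```python
def train(samples, labels, lr=0.1, epochs=500):
    """Training stage: fit a weight and bias to labeled examples."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = w * x + b
            err = pred - y
            # Gradient descent on squared error.
            w -= lr * err * x
            b -= lr * err
    return w, b

def execute(model, x):
    """Execution (inference) stage: apply the trained parameters."""
    w, b = model
    return w * x + b

# Learn the mapping y = 2x + 1 from a handful of toy examples.
model = train([0, 1, 2, 3], [1, 3, 5, 7])
print(execute(model, 4))  # close to 9
```

Training dominates the arithmetic cost, which is why GPUs took over that stage first; execution happens billions of times a day across users' queries, which is why companies now want specialized silicon for it too.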

The chip is also designed to work not just with one type of deep neural network, but with many. "We can boil neural networks down to a very small number of primitives, and even within those primitives, there are only a couple that matter," Rao says, meaning that just a few fundamental hardware ideas can drive a wide range of deep learning services.
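Rao doesn't name the primitives, but in practice most deep-learning layers reduce to two operations: a matrix multiply and an elementwise nonlinearity. The sketch below---with arbitrary placeholder weights, not a trained model---shows a two-layer forward pass built from nothing else, which is why hardware that accelerates those couple of operations covers so many different networks.

```python
def matmul(A, B):
    """Primitive 1: matrix multiply, the workhorse of deep learning."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def relu(M):
    """Primitive 2: elementwise nonlinearity."""
    return [[max(0.0, v) for v in row] for row in M]

# A two-layer forward pass expressed with just those two primitives.
# The weights here are hypothetical placeholders.
x  = [[1.0, -2.0]]
W1 = [[0.5, -1.0, 2.0], [1.0, 0.0, -0.5]]
W2 = [[1.0], [-1.0], [0.5]]

hidden = relu(matmul(x, W1))
output = matmul(hidden, W2)
print(output)  # → [[1.5]]
```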

Today, GPUs are still the most effective way of training AI systems, while companies are exploring all sorts of hardware for execution. Baidu executes with help from GPUs, for instance, while Microsoft uses programmable chips called FPGAs. Google went so far as to design its own chip, the TPU. But GPUs---originally designed for other purposes---are far from ideal. "They just happen to be what we have," says Sam Altman, president of the tech accelerator Y Combinator and co-chairman of open-source AI lab OpenAI. And not everyone has the resources to program their own chips, much less design them from scratch.

That's where a chip like Nervana comes in. The question is how effective it will be. "We have zero details here," says Patrick Moorhead, the president and principal analyst at Moor Insights and Strategy, a firm that closely follows the chip business. "We just don't know what it will do."

But Altman, for one, is bullish on Intel's technology. He was an investor in Nervana when it was a startup. "Before that experience, I was skeptical that startups were going to play a really big role in designing new AI," he told me last week, even before Intel announced its new chip. "Now I have become much more optimistic."

Intel certainly gives this technology an added boost. Intel chips powered the rise of the PC and the data center machines that serve up the modern Internet. The company has the infrastructure needed to build chips at scale. It has the sales operation needed to push them into the market. And after years as the world's dominant maker of data center chips, it has the leverage needed to get these chips inside the Internet's biggest players. Intel missed the market for smartphone chips. But it still has a chance with AI.