Intel and Facebook Take Aim at NVIDIA

Over the past several years, the adoption of artificial intelligence (AI) has made countless headlines, and the number of uses for AI's data-mining and pattern-recognition capabilities has exploded. Chipmaker NVIDIA (NASDAQ: NVDA), which pioneered the graphics processing unit (GPU), was the biggest beneficiary of the adoption of AI. The parallel processing capability of its GPUs, which brought about a revolution in image rendering, turned out to be a surprisingly good fit for training AI systems.
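For readers curious what that "good fit" looks like in practice, here is a minimal sketch -- not from the article, and assuming the PyTorch library and a CUDA-capable NVIDIA GPU -- of the kind of large matrix math that dominates neural-network training and that a GPU can spread across thousands of cores at once:

```python
# Illustrative sketch only: why GPU parallelism suits neural-network training.
# Assumes PyTorch is installed; the GPU branch runs only if CUDA is available.
import torch

# Two large matrices, roughly the shape of the math inside one network layer.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# On the CPU, this multiply is handled by a handful of cores.
cpu_result = a @ b

# On an NVIDIA GPU, the same operation is split across thousands of cores
# in parallel, which is what drew AI researchers to the GPU in the first place.
if torch.cuda.is_available():
    gpu_result = (a.cuda() @ b.cuda()).cpu()
```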

NVIDIA was able to capitalize on that market, supplying its GPUs to data scientists, researchers, and data centers. Its stock has grown more than five-fold in the past three years, going from about $27 per share in early 2016 to over $140 per share today -- and that's after losing half of its value during the year-end market rout.

That level of success breeds competition, and Intel (NASDAQ: INTC) and Facebook (NASDAQ: FB) have joined forces in an effort to dislodge NVIDIA from its leadership position.


Intel's Nervana neural network processor. Image source: Intel.

Chipping away at the competition

At the Consumer Electronics Show in Las Vegas this week, Intel said it was working with Facebook to develop the Nervana Neural Network Processor for Inference (NNP-I). The company said this new class of processor will cater to heavy AI workloads by accelerating the inference stage, and it is scheduled to go into production this year. Intel is also working on a Neural Network Processor for Training, code-named Spring Crest, which it expects to be available later this year.

For the uninitiated, AI processes occur in two very distinct stages: training and inference. The training phase involves developing the algorithms and computer models necessary to complete a specified task, such as language processing or image recognition. This phase of the operation is computationally intensive, which is what initially attracted researchers to the GPU. The second phase, known as inference, occurs after the system has been programmed with the necessary data and is working on the task it was trained for -- such as tagging friends in photos.
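To make those two phases concrete, here is a minimal sketch of training versus inference. It is purely an illustration of the concept, assuming the scikit-learn library and its built-in digits dataset rather than anything Intel, Facebook, or NVIDIA actually ships:

```python
# Illustrative sketch only: the two phases of an AI workload.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

digits = load_digits()

# Training phase: computationally heavy -- the model's parameters are fit
# to a large body of labeled examples. This is the work GPUs accelerate.
model = LogisticRegression(max_iter=2000)
model.fit(digits.data[:-10], digits.target[:-10])

# Inference phase: the trained model is applied to new, unseen inputs --
# analogous to tagging a friend in a freshly uploaded photo.
predictions = model.predict(digits.data[-10:])
```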

Intel initially announced its partnership with Facebook in late 2017. In a blog post at the time, Intel CEO Brian Krzanich said, "We are thrilled to have Facebook in close collaboration sharing its technical insights as we bring this new generation of AI hardware to market." Intel planned to deliver the initial version of the chip to Facebook and other partners for testing and feedback before embarking on the sophomore version.

Building a better mousetrap

Intel isn't the only company looking to cash in by developing chips designed specifically for AI. Google, a subsidiary of Alphabet (NASDAQ: GOOGL) (NASDAQ: GOOG), developed the tensor processing unit (TPU), which is currently in its third generation. Google has not made the processor available for sale; instead, it uses the chip internally to power its Google Cloud.

Amazon (NASDAQ: AMZN), an early pioneer and current leader in cloud computing, announced late last year at its re:Invent conference that Amazon Web Services (AWS) had developed the Inferentia chip, designed to deliver high-performance predictions at an extremely low cost. The inference phase accounts for 90% of the expense of AI operations, prompting Amazon to focus on cost savings. The company called the resulting processor "a game changer," but it doesn't plan to bring the chips to market, instead using them internally for its cloud-computing customers.

The business ramifications

Fears of slowing growth resulting from declining cryptocurrency-related sales and a marked deceleration in data-center revenue -- which also includes processors used for AI -- had investors selling off NVIDIA shares. That, combined with the end-of-year correction, sent NVIDIA into a freefall, with the stock declining by 50%.

It's important to note that even though a host of other companies are working to create a better solution for AI systems, they have yet to best the humble GPU. With NVIDIA stock available at fire-sale prices, now might be the time to initiate or add to a position -- I did just that, adding shares last month.

More From The Motley Fool

John Mackey, CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. Danny Vena owns shares of Alphabet (A shares), Amazon, Facebook, and Nvidia. The Motley Fool owns shares of and recommends Alphabet (A and C shares), Amazon, Facebook, and Nvidia. The Motley Fool has a disclosure policy.
