IBM Continues To Invest Big In High-Performance Computing



IBM has a long and rich heritage of success in HPC (high-performance computing) markets and workloads. The first IBM HPC system I paid attention to was Blue Gene, announced in 2004, which, according to IBM, was “built to help biologists observe the invisible processes of protein folding and gene development.” Hence the name. This was during my long stint at AMD, and I remember asking my ex-Motorola colleagues about its PowerPC architecture. Fast forward to the impressive CORAL wins for Summit and Sierra, and everything in between, and it’s hard to ignore the influence IBM has in the market. There has been some recent speculation about whether IBM has what it takes to keep innovating in HPC, which is understandable given the recent news out of the DOE that it chose Cray for El Capitan, Frontier, and Shasta. However, I think that line of thinking misses the big picture of what HPC is and where it’s going.

I see “HPC” as three markets, not just one:

  1. Large, federal government-funded, semi-custom “funny cars” that occupy the top 5-10 spots on the TOP500. This is where the CORAL implementations sit.
  2. Smaller, government-funded installations built with mostly off-the-shelf components.
  3. Enterprise systems used by financial and energy verticals, which don’t show up on the TOP500.

IBM has recently participated in #1 and #3, and a little bit in #2. I thought it would help to give you a flavor of some of IBM’s successes in #2 and #3.

The Weather Company GRAF

Back at CES in January, IBM announced (in conjunction with its subsidiary The Weather Company) the IBM Global High-Resolution Atmospheric Forecasting System—GRAF for short. Utilizing IBM POWER9-based supercomputers, GRAF promises to deliver the world’s first hourly-updating commercial weather system capable of predicting weather events down to small local thunderstorms. IBM says GRAF will deliver around a 200% improvement in forecasting resolution across the globe—from an average resolution of 12 km down to 3 km. One interesting thing about GRAF is that it will incorporate previously untapped sensor data from aircraft into its models, as well as data from amateur weather stations. You can find more information here.
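To put those resolution numbers in perspective, here is a rough back-of-envelope sketch. It is my own arithmetic, not IBM’s actual GRAF configuration: going from roughly 12 km to 3 km grid spacing multiplies the number of horizontal grid cells by about sixteen, before you even account for the shorter time steps a finer grid requires, which is why this class of forecasting needs supercomputer-scale hardware.

```python
# Back-of-envelope: why finer grid spacing drives up compute.
# Illustrative only -- these are not IBM's actual GRAF numbers,
# just the 12 km -> 3 km spacings cited above.

EARTH_SURFACE_KM2 = 510e6  # approximate surface area of Earth

def horizontal_cells(grid_spacing_km: float) -> float:
    """Rough count of horizontal grid cells covering the globe."""
    return EARTH_SURFACE_KM2 / (grid_spacing_km ** 2)

coarse = horizontal_cells(12.0)   # ~3.5 million cells
fine = horizontal_cells(3.0)      # ~57 million cells

print(f"12 km grid: ~{coarse:,.0f} cells")
print(f" 3 km grid: ~{fine:,.0f} cells")
print(f"cell-count ratio: ~{fine / coarse:.0f}x")  # ~16x

# A finer grid also forces a smaller time step (the CFL condition),
# so total work grows even faster than the 16x cell-count increase.
```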

Barcelona Supercomputing Center (BSC)

While Lenovo gets a lot of the news around BSC, it all started with IBM. IBM has a longstanding partnership with the Barcelona Supercomputing Center, and the two are currently involved in a number of ongoing HPC research projects together. These include:

  • an exploration of the use of high-performance in-memory databases for the Blue Gene Active Storage architecture (which I mentioned earlier)
  • an exploration in applying software-defined environments (SDE) to HPC workloads
  • work on cognitive deep learning with HPC tools (particularly as it relates to image processing)
  • adaptive resource management for IBM Power architectures

The BSC lists these and other “current projects” that you can read more about here.

Eni

Another interesting partnership of IBM’s is with Eni, a global energy company. Together, the companies are collaborating on an AI platform to help drive Eni’s decision-making during the first stages of hydrocarbon exploration. Hydrocarbon exploration is a complicated endeavor that takes into account a large amount of geological, geochemical, and physical data in order to decide, essentially, where there are likely to be hydrocarbons and where to drill and explore further. AI aids in the contextualization and presentation of this information. IBM calls the process of mining all of this preexisting data for insights “cognitive discovery.” You can read more about this here.

Total

A few months ago, IBM announced the construction of a new IBM POWER9-based supercomputer called Pangea III for the global energy company Total. IBM touts Pangea III as “the world’s most powerful commercial computer,” with 25 petaflops of compute power and 50 petabytes of storage capacity. The supercomputer purportedly ranks as the #11 system on the TOP500. Total says it will put Pangea III to use on:

  • higher resolution seismic imaging for exploration and development
  • the generation of predictive production models
  • asset valuation and selectivity

You can read more about this here.

U.S. Army

Datacenter Dynamics reported in August that the US Army bought a $12M “supercomputer” to be used by multiple agencies inside the Department of Defense. The most interesting thing about this implementation is that it sits inside a pod with chilled water and can be moved anywhere a C-130 could take it. The 6-petaflop machine reportedly uses 148 nodes, each with two IBM POWER9 processors. The Department of Defense doesn’t like to discuss workloads or applications, but you can bet, given the NVIDIA V100s and T4s, that it’s a machine learning application.
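As a quick sanity check (my own arithmetic, not anything from the Datacenter Dynamics report), dividing the claimed peak by the node count shows why the accelerators have to be doing the heavy lifting:

```python
# Rough per-node arithmetic for the reported Army system.
# Only "6 petaflops" and "148 nodes" come from the report; the
# per-device peak figures below are generic published specs.

total_pflops = 6.0
nodes = 148

tflops_per_node = total_pflops * 1000 / nodes
print(f"~{tflops_per_node:.0f} TFLOPS per node")  # ~41 TFLOPS

# A dual-socket CPU node on its own peaks at only a few TFLOPS of
# double precision, while a single NVIDIA V100 delivers roughly
# 7-8 TFLOPS FP64, so a per-node figure this high only adds up with
# several accelerators per node -- which is why the author reads
# this as a GPU-heavy, machine-learning-oriented machine.
```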

Quantum HPC future?

Many pundits thought IBM was crazy in the way it architected its recent POWER processors, namely with its “Swiss Army knife of acceleration” approach. When IBM had to make the bet on CAPI, PCIe Gen4, and NVLink, GPU and FPGA acceleration with shared memory wasn’t even a thing. Now, GPU-accelerated computing is all the rage and FPGAs are gaining steam. IBM saw the future, bet on it, and was right, hence the success in those accelerated applications.


I see IBM doing the same thing with its quantum efforts. Moor Insights & Strategy analyst Paul Smith-Goodson outlined this month where he sees quantum computing making huge impacts:

  • New chemicals, drugs, and materials can be modeled, modified, and designed with custom properties to develop new pharmaceutical, commercial, and business products.
  • Today we use supercomputers for a variety of optimization problems, such as Monte Carlo simulations, energy applications, and bond pricing (a small classical sketch follows this list). Quantum computers will allow for more robust simulations on a much larger scale, providing more in-depth insight, higher efficiency, and better forecasting.
  • The combination of quantum computing and artificial intelligence is almost a scary thought. Artificial intelligence might become orders of magnitude smarter than it is today. CORAL was very clear that its direction isn’t just a FLOPS monster, but FLOPS plus AI.
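To make the Monte Carlo point concrete, here is a minimal classical pricing sketch of the kind HPC clusters grind through today, scaled up by many orders of magnitude. The parameters and the pricing setup are hypothetical and purely illustrative, not anyone’s production model; quantum approaches such as amplitude estimation aim to reach the same answers with far fewer samples.

```python
# Minimal classical Monte Carlo pricing sketch -- the embarrassingly
# parallel style of workload referenced above. All parameters are
# hypothetical.
import math
import random

def monte_carlo_call_price(s0=100.0, strike=105.0, rate=0.02,
                           vol=0.25, years=1.0, paths=200_000):
    """Price a European call by simulating terminal prices under
    geometric Brownian motion and discounting the average payoff."""
    drift = (rate - 0.5 * vol ** 2) * years
    diffusion = vol * math.sqrt(years)
    payoff_sum = 0.0
    for _ in range(paths):
        z = random.gauss(0.0, 1.0)                 # one random draw per path
        s_t = s0 * math.exp(drift + diffusion * z) # simulated terminal price
        payoff_sum += max(s_t - strike, 0.0)       # call payoff
    return math.exp(-rate * years) * payoff_sum / paths

print(f"Estimated call price: {monte_carlo_call_price():.2f}")
```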

Essentially, what Smith-Goodson is talking about here is the next generation of HPC. Dr. Jeffrey Welser, Vice President of IBM Research, gave the keynote address at SEMICON West, where he said it will take another three to five years to develop a mid-scale quantum computer, and acknowledged it would take another 10 to 15 years until we realize the real benefits of quantum computing. With the IBM Q Experience, researchers can access quantum devices and simulators to get started now.
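For readers who want to try it, programmatic access today runs through Qiskit, IBM’s open-source Python SDK for the Q Experience. The snippet below is a minimal sketch that runs a two-qubit Bell-state circuit on the local simulator; exact APIs vary by Qiskit version, so treat it as illustrative rather than definitive.

```python
# Minimal Qiskit sketch: build and simulate a two-qubit Bell state.
# Follows the Qiskit 0.x-era interface; newer versions rename some calls.
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2, 2)
qc.h(0)              # put qubit 0 into superposition
qc.cx(0, 1)          # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)        # roughly even split between '00' and '11'
```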

IBM is, without a doubt, one of the leaders, if not the leader, in quantum computing, which we at Moor Insights & Strategy believe is part of the future of HPC.

In closing

I hope you can now see a few things about the HPC market and IBM’s participation in it. First, the HPC market is more than a top-10 TOP500 market and is rapidly expanding into enterprise financial, energy, and pharma applications. Second, IBM has technology in areas like quantum that could put it right back into the new top 10, in addition to better serving the smaller TOP500 and enterprise HPC applications. While we have to acknowledge the recent top-10 TOP500 setbacks, I think it’s more productive to look at IBM’s HPC future, which is bright based on its 40-year investment in quantum.
