Time to drop Moore’s Law to advance computing, says researcher

12 April 2017

 

Dropping Moore’s Law is perhaps the best thing that could happen to computers, as it will hasten the move away from an aging computer architecture holding back hardware innovation.

That is the view of prominent scientist R Stanley Williams, a senior fellow at Hewlett Packard Labs. Williams played a key role in HP's creation of the memristor in 2008.

Moore's Law is an observation made by Intel co-founder Gordon Moore in 1965 that has helped make devices smaller and faster. It predicts that the density of transistors on a chip will double every 18 to 24 months, while the cost of making those chips falls.
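
For a rough sense of the arithmetic (an illustration, not a claim from the article): a 24-month doubling period multiplies transistor counts by about 2^(years/2). The short Python sketch below projects that growth, starting from the roughly 2,300 transistors of Intel's 4004; the time spans chosen are arbitrary.

    def projected_transistors(start_count: float, years: float,
                              doubling_months: float = 24.0) -> float:
        """Project a transistor count forward, assuming steady doubling."""
        doublings = (years * 12.0) / doubling_months
        return start_count * 2.0 ** doublings

    # Starting point: Intel 4004 (1971), roughly 2,300 transistors.
    for years in (10, 20, 30, 40):
        count = projected_transistors(2_300, years)
        print(f"after {years} years: ~{count:,.0f} transistors")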

Predictable development
Thanks in part to the cadence set by Moore's Law, the same money buys significantly faster computers and mobile devices each year. The observation has helped drive up device performance on a predictable schedule while keeping costs down. But the predictions tied to Moore's Law are reaching their limits, as it becomes harder to make chips at smaller geometries. That is a challenge facing all the top chip makers, including Intel, which is changing how it interprets Moore's Law as it clings to it for dear life.

Williams is the latest to join a growing cadre of scientists who predict Moore's Law is dying. The end of Moore's Law “could be the best thing that has happened to computing in decades,” Williams wrote in a paper published in the latest issue of the IEEE journal Computing in Science & Engineering.

The end of Moore's Law will bring creativity to chip and computer design and push engineers and researchers to think outside the box, Williams said. Moore's Law, he suggested, has bottled up innovation in computer design.

So, what is next? Williams predicted computers built from collections of chips and accelerators patched together, much like early supercomputers. Computing could also become memory-driven, with a much faster bus delivering greater speed and throughput.

Memory driven
The idea of a memory-driven computer plays to the strengths of HPE, which has built The Machine along those lines. The initial version of The Machine has persistent memory that can serve as both DRAM and flash storage, but it could eventually be based on memristors, an intelligent form of memory and storage that can track data patterns.
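
To make the memristor idea concrete: in the linear ion-drift model HP researchers published alongside the 2008 device (Strukov et al.), the part's resistance depends on how much charge has flowed through it, which is why it retains state without power. The Python sketch below simulates that model; the parameter values are illustrative assumptions, not figures for any shipping part.

    import math

    # Linear ion-drift memristor model (Strukov et al., 2008).
    # All parameter values here are illustrative, not measured.
    R_ON, R_OFF = 100.0, 16_000.0  # fully doped / undoped resistance (ohms)
    D = 10e-9                      # device thickness (metres)
    MU_V = 1e-14                   # dopant mobility (m^2 per volt-second)

    def simulate(v_amp=1.0, freq=1.0, steps=10_000, t_end=2.0):
        """Drive the device with a sine voltage; return (time, memristance) pairs."""
        w = 0.5 * D  # width of the doped region: the device's state
        dt = t_end / steps
        history = []
        for n in range(steps):
            t = n * dt
            m = R_ON * (w / D) + R_OFF * (1.0 - w / D)   # memristance now
            i = v_amp * math.sin(2.0 * math.pi * freq * t) / m
            # Current drifts the doped boundary; clamp to the physical range.
            w = min(max(w + MU_V * (R_ON / D) * i * dt, 0.0), D)
            history.append((t, m))
        return history

    # Resistance falls under positive bias, recovers under negative bias,
    # and whatever value it last held persists when power is removed.
    print(f"final memristance: {simulate()[-1][1]:,.0f} ohms")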

Memory-driven computing could also break the processor-centric architecture's current hold on the computer market. In the longer term, neuromorphic chips designed around the way the brain works could drive computing. HPE is developing a chip designed to mimic the human brain, and similar chips are being developed by IBM, Qualcomm, and universities in the US and Europe.

“Although our understanding of brains today is limited, we know enough now to design and build circuits that can accelerate certain computational tasks,” Williams wrote.
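
As a concrete example of such a circuit (an illustration, not something from Williams's paper): spiking chips like TrueNorth implement networks of simple neuron models in silicon rather than conventional logic. The sketch below shows a leaky integrate-and-fire neuron, the kind of primitive this hardware accelerates; all parameters are made up for the example.

    # Leaky integrate-and-fire neuron: the basic primitive that spiking
    # neuromorphic hardware implements. Parameters are illustrative only.
    def lif_neuron(input_currents, threshold=1.0, leak=0.9, v_reset=0.0):
        """Return the spike train produced by a stream of input currents."""
        v = v_reset                 # membrane potential
        spikes = []
        for i in input_currents:
            v = leak * v + i        # integrate the input, leaking toward zero
            if v >= threshold:      # fire once the threshold is crossed...
                spikes.append(1)
                v = v_reset         # ...then reset the potential
            else:
                spikes.append(0)
        return spikes

    print(lif_neuron([0.3, 0.4, 0.5, 0.0, 0.6, 0.6]))  # -> [0, 0, 1, 0, 0, 1]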

Applications like machine learning highlight the need for new types of chips. IBM has benchmarked its TrueNorth neuromorphic chip as faster and more power-efficient than the GPUs conventionally used for deep learning.

Williams suggested application-specific integrated circuits (ASICs) and field-programmable gate arrays (FPGAs) could play a role in driving computing beyond Moore's Law. These technologies will use superfast interconnects such as Gen-Z, which was introduced last year and will be supported by major chipmakers and by server makers such as Dell and Hewlett Packard Enterprise.

Quantum computers are also emerging as a way to replace today’s PCs and servers, but are still decades away from running everyday applications.

 

IDG News Service
