Will future computers rely more on memory than on processors? Hewlett Packard Enterprise thinks so. Meet The Machine.

CEO Meg Whitman on Monday in London touted what she says is the future of computing in a world in which “everything computes” with machines that are “memory driven” rather than processor-driven. The technology has the potential to provide a “quantum leap in compute performance and energy efficiency,” notes The Register, a U.K. tech news website.

Whitman takes the hype a bit higher.

“We believe this has the potential to change everything,” she declared in a blog post.

“We have built the first proof-of-concept prototype that illustrates our Memory-Driven Computing architecture, a significant milestone in The Machine research program we announced just over two years ago.”

Running on Linux, The Machine relies on photonics (optical data transmission) and system-on-a-chip tech to speed things up. And one major thought leader sees The Machine’s potential.

“Every scientist knows that the amount of data we have to deal with has been exploding exponentially – and that computational speeds are no longer keeping up,” pointed out California Institute of Technology theoretical physicist Sean Carroll in a Q&A posted by HPE. (He’s also an advisor to The Big Bang Theory, by the way.) “We need to be imaginative in thinking of ways to extract meaningful information from mountains of data. In that sense, Memory-Driven Computing may end up being an advance much like parallel processing – not merely increasing speed, but also putting a whole class of problems within reach that had previously been intractable.”

The technology breakthrough could pose a real challenge for competitors in the server industry, including Lenovo and IBM. And what could this mean for personal computing? (HPE focuses on servers; the split of HP saw the PC group included in HP Inc., the top threat to No. 1 PC maker Lenovo. Will HPE and HP Inc. be sharing “The Machine” breakthrough?)

  • VIDEO: Watch an overview of “The Machine” project at: https://www.youtube.com/watch?v=2VG59FYkPdM

In the emerging Internet of Things galaxy, Whitman said that by 2020, “20 billion connected devices will generate record volumes of data,” plus even more “information that we expect will be generated by the growing internet of industrial assets.” And traditional technology can’t keep pace, she stressed.

“Even if we could move all that data from the edge to the core, today’s supercomputers would struggle to keep up with our need to derive timely insights from that mass of information. In short, today’s computers can’t sustain the pace of innovation needed to serve a globally connected market.”

Thus, The Machine.

Positive reaction

Here’s how The Register sums up The Machine:

“The Machine is being touted as a memory-driven computer in which a universal pool of non-volatile memory is accessed by large numbers of specialized cores, and in which data is not moved from processor (server) to processor (server), but in which data can stay still while different processors are brought to bear on either all of it or subsets of it.

“Aims include not moving masses of data to servers across relatively slow interconnects, and gaining the high processing speed of in-memory computing without using expensive DRAM. [From Wikipedia: Dynamic random-access memory (DRAM) is a type of random-access memory that stores each bit of data in a separate capacitor within an integrated circuit.] The main benefit is hoped to be a quantum leap in compute performance and energy efficiency, providing the ability to extend computation into new workloads as well as speed analytics and HPC and other existing workloads.”
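The core idea The Register describes – data stays put in one shared pool while different processors are “brought to bear” on subsets of it – can be loosely sketched in software. The following is a hypothetical illustration (not HPE’s actual code or API): two worker processes attach to a single shared-memory region and each transform their own slice in place, with no copies of the data moving between them.

```python
# Hypothetical sketch of the memory-driven idea: one shared pool of data,
# multiple processors operating on slices of it in place. This is an
# analogy using Python's standard library, not HPE software.
from multiprocessing import Process
from multiprocessing.shared_memory import SharedMemory


def worker(pool_name, start, end):
    # Each "processor" attaches to the same pool by name; nothing is copied.
    shm = SharedMemory(name=pool_name)
    for i in range(start, end):
        # Operate on the data where it lives (here: double each byte).
        shm.buf[i] = (shm.buf[i] * 2) % 256
    shm.close()


def run_demo():
    # The "universal pool" of non-volatile memory, stood in for by a
    # small shared-memory segment holding the bytes 0..7.
    pool = SharedMemory(create=True, size=8)
    try:
        pool.buf[:8] = bytes(range(8))
        # Bring two processors to bear on different subsets of the pool.
        procs = [Process(target=worker, args=(pool.name, 0, 4)),
                 Process(target=worker, args=(pool.name, 4, 8))]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        return list(pool.buf)
    finally:
        pool.close()
        pool.unlink()


if __name__ == "__main__":
    print(run_demo())  # each byte doubled in place by whichever worker owned it
```

The contrast with the conventional model is that neither worker ever receives a copy of the data across an interconnect; both see the same pool, which is the behavior the architecture aims to provide at datacenter scale with non-volatile memory instead of DRAM.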

Whitman pointed out that by putting memory, not the processor, at the center of the computing architecture, HPE is “dramatically increasing the scale of computational power that we can achieve.”

And the news drew immediate positive reaction.

Notes The Wall Street Journal: HPE “has reached a milestone in a high-profile plan to deliver a new kind of computer. But the company is betting its business may benefit as much from components it developed for the project as from the complete system.”

Adds Fortune: “The technology giant hopes that its research effort will revolutionize how computers are built so that they can more quickly process data.”

Crunching data

While scientists and engineers have worried about and fought to keep chip processing power ahead of demand, Whitman says HPE is tackling the ever-growing need for more data-crunching power in a different way.

“Since the beginning of the computing age, we have come to expect more and more from technology and what it can deliver – faster, more powerful, more connected,” she wrote.

“But today’s computing systems – from smartphones to supercomputers – are based on a fundamental platform that hasn’t changed in sixty years. That platform is now approaching a maturation phase that, left unchecked, may limit the possibilities we see for the future. Our ability to process, store, manage and secure information is being challenged at a rate and scale we’ve never seen before.”

The Journal notes that HPE engineers have “booted up” a prototype that it describes as “an unusual system that leans heavily on memory technology to boost calculating speed.”

Fortune adds that The Machine, “unlike a traditional server, uses advanced photonics (the transmission of information via light) to help processors more quickly access data from memory chips.”

The approach is so different that HPE has had to redesign servers, from chips to chassis, Fortune adds.

While availability is two or three years away, Whitman pointed out: “The Machine research program demonstrates that Memory-Driven Computing is not just possible, but a reality. Working with this prototype, HPE will be able to evaluate, test and verify the key functional aspects of this new computer architecture and future system designs, setting the industry blueprint for the future of technology.”