The Future of Computing: Big Ideas or Incremental Improvements?

The “Rebooting Computing” initiative spearheaded by the IEEE is one of the more interesting efforts designed to tackle the looming technological hurdles facing the IT industry. The effort seeks to encourage researchers and companies to build systems that deliver far greater performance while consuming far less energy. Security will also have to be baked in from the start.

The urgent need for better architectures is being driven in part by the Internet of Things. Precise weather forecasting, for instance, is the sort of application we expect IoT to deliver. But, as George Leopold notes, the weather satellites in the U.S. already generate about 20 terabytes of observational data a day. Developing a system that can take advantage of that data in real time will be a tall order.
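To get a feel for why that is a tall order, here is a quick back-of-envelope sketch in Python. The only figure taken from the article is the 20 terabytes per day; the conversion assumes decimal terabytes and a flat, around-the-clock arrival rate.

```python
# Back-of-envelope: what 20 TB of observational data per day means as a
# sustained ingest rate. Assumes decimal terabytes and a flat arrival rate.

TB = 10**12                      # bytes in a (decimal) terabyte
daily_volume_bytes = 20 * TB     # ~20 TB of satellite data per day (from the article)
seconds_per_day = 24 * 60 * 60

sustained_rate = daily_volume_bytes / seconds_per_day
print(f"Sustained ingest rate: {sustained_rate / 10**6:.0f} MB/s")
# -> roughly 230 MB/s, nonstop, before any processing, replication,
#    or headroom for bursts is accounted for.
```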

“The cloud and IoT change things,” Georgia Tech’s Thomas Conte told Leopold in a story on the effort. Together, “they offer a way of enabling the rebooting of computing.”

One Giant Leap or Small Steps?

So how will the computer industry tackle this? Expect to see one of those classic philosophical battles between Big Ideas and Incremental Improvement, Radical Reinvention and Rigorous Retrofitting. Do we solve this problem by pioneering completely new architectures or wringing more out of the technology we already have?

Some of the concepts from the Big Ideas camp include quantum computing, neuromorphic architectures and adiabatic computing. HP’s “The Machine,” a highly elegant system centered on memristors, reduces energy consumption while speeding up transactions by fusing memory and storage.

I can’t criticize the technical merits of these approaches. I don’t even know what adiabatic means, and whenever I had to write about quantum computing as a reporter I just pretended I understood it. (“Ah, so you’re focusing on the quantum superposition of paired theoretical particles… Now I gotcha!”)

But here is where the debate comes in: history shows that steady improvements to existing technologies often beat out revolutionary, leap-ahead ideas. Remember Intel’s Itanium? It was going to change server architecture. But it became irrelevant because of the relentless performance improvements Intel was able to achieve with its own x86 processors.

The Hall of Fame of Ideas That Didn’t Take Flight

Ovonics? It was supposed to replace memory and storage back in the 1970s because lithography was about to run out of gas. Quantum computing is percolating into the market through Amazon and a few other companies, but it’s been a work in progress since the 1980s.

Progress toward many of the goals of Rebooting Computing (lower power consumption, greater performance, greater computing density) is already being achieved through flash memory. The benchmarks make great reading. Better security? Automatic encryption is a feature of new SSDs. A fusion of memory and storage? PCIe-attached storage already achieves that to some degree by bringing large amounts of storage close to the processor to reduce latency. Incremental improvements often aren’t glamorous, but they work.
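To make the latency point a bit more concrete, here is a tiny sketch that prints rough, order-of-magnitude access latencies. The figures are my own ballpark assumptions for illustration, not measurements, vendor specifications, or numbers from the article; the point is simply the ordering of the tiers.

```python
# Rough, order-of-magnitude latency tiers (illustrative assumptions only)
# showing why moving storage onto PCIe narrows the gap between memory
# and storage compared with older interfaces.

typical_latency_ns = {
    "DRAM access":        100,          # ~100 ns (assumed ballpark)
    "NVMe SSD (PCIe)":    100_000,      # ~100 us (assumed ballpark)
    "SATA SSD":           200_000,      # ~200 us (assumed ballpark)
    "Spinning disk seek": 10_000_000,   # ~10 ms  (assumed ballpark)
}

for device, ns in typical_latency_ns.items():
    print(f"{device:<20} ~{ns / 1000:>10,.1f} microseconds")
```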

In the end, we need both approaches: the moonshot-scale ideas that unexpectedly bring us into a new era and the continual stream of improvements that change markets almost invisibly. Virtualization was a mind-bending idea 20 years ago. Now IT managers use it to drive server utilization levels past 90 percent. 3D memory, which will become more widespread in the next few years, will have profound effects on computer design, but ten years from now it will seem commonplace.
But in the meantime, the debate is going to make for an interesting spectacle.
