Evolution Shift
A Future Look at Today
August 6th, 2018

The World’s Fastest Computer

Recently, I visited Oak Ridge National Laboratory near Knoxville, Tennessee. If that name sounds familiar, it is because it was one of the key places where the nuclear age was launched. It is the home of the Graphite Reactor, where at 5 a.m. on Nov. 4, 1943, a self-sustaining nuclear chain reaction began. Along with Chicago Pile-1 at the University of Chicago, it is where controlled nuclear fission was born.

Ever since then, Oak Ridge National Laboratory has been one of the primary research labs of the United States. It is at the cutting edge of additive (also known as 3-D) printing and supercomputing, and it will soon begin work on a new generation of nuclear power: commercial reactors the size of a trashcan or a washing machine. I will be visiting in a year to check on those developments.

But this column is about Summit, the machine based at Oak Ridge that on June 8 became the world's fastest supercomputer. I have seen several supercomputers in the past decade, but never one that was the fastest in the world at the time, so this visit was truly exciting for me.

Prior supercomputers at Oak Ridge that were, for short periods, the fastest in the world were named Titan and, before that, Jaguar. The man who has overseen the operation and administration of all three at the lab is Arthur Bland, whom everyone in town calls Buddy. Bland spent an hour with me, answering not only my specific questions about Summit but also questions about the history of supercomputers.

The Department of Energy basically invented supercomputers for the nuclear energy program to measure “criticality” — the point at which self-sustaining nuclear chain reactions occur. Initially, computing was done by people, then after World War II by new mainframe computers. Once American nuclear testing stopped in 1992, supercomputers became the way to certify that the country’s nuclear weapon arsenal was “safe and effective” — safe in that nothing would blow up when it shouldn’t and effective in that explosions would occur if they were needed.

The Department of Energy also started to use supercomputers for all aspects of energy research, for high-level projections of increasing energy efficiencies and, of course, for studying weather and climate.

Anyone who has heard me speak knows that I think the smart hand-held device is one of the most — if not the most — transformative technological developments in history. My Apple iPhone 8+ has far more computing power than all of NASA possessed when John Glenn orbited the Earth.

Bland confirmed that my hand-held device is much more powerful and faster than the first Cray supercomputers of the mid-to-late 1970s. Think about that: any recently purchased smartphone is dramatically more powerful than the supercomputers of the 1970s, and it is small enough to hold in one hand instead of occupying an entire 10,000-square-foot air-conditioned room.

In the last quarter of the 20th century, only some major research universities and highly secure government research labs had access to supercomputers: Sandia National Laboratories in Albuquerque, New Mexico, and Livermore, California; the National Renewable Energy Laboratory in Golden, Colorado; Los Alamos in New Mexico; and Oak Ridge. Now we all can hold that speed and power in our hands.

Evidently, the development of supercomputers is at a stage where each new iteration increases computing power exponentially. Titan had 10 times the performance of Jaguar while using only 10 percent more power. Summit has 10 times the performance of Titan with no increase in power usage.

Jaguar had a peak performance of 2.6 quadrillion calculations per second; a quadrillion is 1,000 trillion. Titan had a peak performance of 27 quadrillion calculations per second: a tenfold increase in performance for only a 10 percent increase in power. Summit can do 200 quadrillion calculations per second with no power increase. And working with the chipmaker Nvidia, Oak Ridge had Summit's graphics processing unit chips configured so that, when 64-bit calculations are not needed and 16-bit calculations suffice, Summit's capability jumps to 3,300 quadrillion calculations per second.
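Taking the quoted peak figures at face value, the generation-over-generation speedups can be checked with a few lines of arithmetic; here is a minimal Python sketch (the variable names are mine, and the "roughly 10x" claims hold in round numbers):

```python
# Peak-performance figures quoted above, in quadrillions of
# calculations per second.
jaguar = 2.6
titan = 27.0
summit = 200.0
summit_16bit = 3300.0  # Summit in reduced-precision (16-bit) mode

# Generation-over-generation speedups.
print(f"Titan over Jaguar: {titan / jaguar:.1f}x")                 # 10.4x
print(f"Summit over Titan: {summit / titan:.1f}x")                 # 7.4x
print(f"Summit 16-bit over 64-bit: {summit_16bit / summit:.1f}x")  # 16.5x
```

Titan's jump over Jaguar really is tenfold; Summit's quoted jump over Titan works out closer to seven and a half times on these peak numbers, which rounds to "10 times" only loosely.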

Such numbers are truly beyond human comprehension. In its recent reporting on Summit, the New York Times tried to provide some human scale to these numbers. It stated that Summit was like approximately 20 football stadiums with 100,000 people in each — 2 million people — all using high-powered laptops. A loose analogy at best.

Summit was built by IBM and Nvidia under the oversight of Oak Ridge National Laboratory. It is water cooled, which dramatically cuts power usage by letting electricity go directly to computation rather than to dissipating the heat the machine creates.

Standing in the room with that amount of computing power was truly overwhelming. Thirty-five years ago, a room that big but much colder would have been needed to house a computer with the power I had in my hand.

We live in accelerating, transformative times for sure!
