It’s All About the Teraflops
In the 60-year history of computers, there has been a constant improvement in computational speed; ever faster has always been one of the driving metrics of the industry. Moore's Law has manifested itself in desktops and laptops to the point where the computers we use are as fast as we need. The machines we use today are incredibly faster than those we used at the turn of the century. The power of these machines, however, is dwarfed by the supercomputers now being developed.
It is in the arena of supercomputers that both the outer and inner reaches of reality can be explored. Advanced computer modeling, the running of complex scenarios and, of course, the ability to beat a human chess grandmaster are the realm of supercomputers.
The world's fastest computer is being built and installed at the Argonne National Laboratory in the western suburbs of Chicago. IBM Corp. and the Department of Energy, which owns Argonne, have contracted for a new supercomputer that is now being installed with a peak capability of 445 teraflops, or 445 trillion calculations per second. The current record-holder is the Department of Energy's Lawrence Livermore National Laboratory in California, which has an IBM Blue Gene/L with a peak capability of about 360 teraflops.
To place all this in a historical context, here is a quote from a column written here last year: “…the first mainframe computer, the ENIAC, built in 1946, performed 50,000 calculations per second. Ten years later the IBM 704 mainframe performed at 400,000 per second. By 1982 the number had grown to 100 million for the most powerful mainframe computers in the world.”
This new computer, when combined with the existing computer at Argonne, will provide a computing capacity of 556 teraflops. In addition to this incredible increase in speed, the new IBM Blue Gene/P series consumes a fraction of the power per teraflop required by similar systems built previously, which reduces power demands and lowers operating costs. This is a developing trend across computing, from supercomputers down to PCs: the lowering of both energy usage and, therefore, cost of use, both of which are certainly good trends to embrace and accelerate.
The leap in computational speed represented by this new supercomputer at Argonne will mean an order-of-magnitude increase in throughput, whereby something that might have taken four days to produce results might now be done in four hours. Even more mind-boggling is the talk of a petaflop machine, capable of 1,000 trillion calculations per second, becoming a reality in the not-too-distant future.
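The four-days-to-four-hours arithmetic above can be sketched in a few lines. This is purely illustrative and not from the article: the workload size and the 20-teraflop baseline are hypothetical assumptions, chosen so the job takes about four days on the older machine; it simply shows runtime shrinking in inverse proportion to sustained speed for a perfectly parallelizable job.

```python
# Illustrative sketch: runtime scales inversely with sustained speed,
# assuming a perfectly parallelizable workload. The 20-TFLOPS baseline
# and workload size are hypothetical, not figures from the article.

TERA = 1e12  # one teraflop = 10^12 floating-point operations per second

def runtime_hours(total_ops: float, flops: float) -> float:
    """Hours needed to perform total_ops operations at a sustained rate of flops ops/sec."""
    return total_ops / flops / 3600

# Hypothetical workload sized to take 96 hours (four days) at 20 TFLOPS.
work = 20 * TERA * 96 * 3600

old = runtime_hours(work, 20 * TERA)    # on the assumed older machine
new = runtime_hours(work, 445 * TERA)   # on the 445-teraflop system

print(f"old: {old:.1f} h, new: {new:.1f} h, speedup: {old / new:.1f}x")
```

Run as written, this reports the old runtime as 96 hours and the new one as roughly 4.3 hours, a speedup of about 22x: a large constant-factor gain, not an exponential one.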
To make this more personal, it has been estimated that the magnificent parallel computing entity called the human brain operates at a highly approximate speed of 100 trillion calculations per second. Of course, the brain operates much more contextually than a supercomputer does, constantly recalibrating due to our human, emotional needs. For both supercomputers and human brains there is the old adage of “garbage in, garbage out”: humans and computers alike can be fed incorrect information, so that no matter how fast some problem or scenario may be computed, the result will be flawed.

A supercomputer that can process 556 teraflops is just another indication that the evolution of technology seems to move faster than human evolution. We cannot even begin to keep up with the rapidity of technological change. We must always remember that technology can and should be used to make human life happier and more productive. A machine that can process 556 teraflops is a tool for humanity to use. It is up to us to stay ahead of the process by constantly looking for ways that such a magnificent machine can be utilized for the common good of humanity.