In an apparent effort to sell massive number-crunching facilities to mere companies, rather than to national security agencies, nuclear labs and well-funded universities, IBM has created a smaller and less powerful version of its Blue Gene/L supercomputer.

It said Tuesday it would hire out the Watson Blue Gene system (nicknamed BGW) on a case-by-case basis to companies, research institutes and universities with less stellar budgets that need difficult mathematical problems solved.

BGW, which has a processing speed of 91.29 teraflops, is the most powerful privately owned supercomputer, IBM said. It added that, despite its reduced size, the machine was still expected to rank as one of the top three supercomputers in the world, behind the US$100 million Blue Gene/L at number one and NASA's 10,240-processor Columbia supercomputer at number two.

Blue Gene/L resides at the Lawrence Livermore National Laboratory and runs at 135.3 teraflops, according to independent testers, though IBM says it has a theoretical peak capacity of 360 teraflops. Its 64 server racks cover roughly half a tennis court.

IBM has installed the new BGW at its Thomas J. Watson Research Center in New York state, where it says clients can submit problems to be solved. The company said it hoped the machine could help make breakthroughs in life sciences, hydrodynamics, materials science, quantum chemistry and fluid dynamics, as well as in business applications.

"Researchers, scientists, engineers and inventors can now ask more questions, test more theories, try more designs, and simulate more conditions than has been possible before," said Tilak Agerwala, vice president of systems at IBM Research.

IBM said it intended to provide access to BGW resources for academic and industrial researchers as part of the US Department of Energy's Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program.

IBM said the DoE recently expanded the program to offer selected outside parties access to the larger 64,000-processor Blue Gene/L system at Argonne National Laboratory. IBM would now supplement access to that system with up to 5 percent of the cycles available on BGW. "The program seeks computationally intensive, large-scale research projects and encourages proposals from universities, other research institutions and industry," it said.

The first project for BGW will involve protein dynamics simulations for drug development.

BGW, which comprises 20 refrigerator-sized racks, would also be used by IBM's Center for Business Optimization (CBO), IBM said. The new consulting and software unit taps IBM's mathematicians and its industry and computing expertise to tackle clients' previously unsolvable problems, it said.

The company pointed out that the machine could run sophisticated weather forecasting software to feed predictive models for disaster response and for public electricity demand forecasting. It could also track and analyze world financial markets, IBM added.

Big Blue said BGW would bring supercomputing to the point where it could be made more widely available and applied to a broader set of applications. "The speed and performance of yesterday's supercomputers had limited value and accessibility due to their high cost and large size limiting their use to only a select few at national labs," the company conceded.

This would take the company in a different direction from its supercomputing rivals. Just a few weeks ago, the Japanese government established a program with NEC, Hitachi and several universities to develop, by 2011, a supercomputer able to perform three quadrillion calculations per second (three petaflops; a petaflop is 1,000 teraflops), blowing Blue Gene/L and its peers out of the water. It would be used for biotechnology and nanotechnology research, the Ministry of Education, Culture, Sports, Science and Technology said.
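
To put those units in perspective, here is a back-of-the-envelope sketch in Python using the figures quoted in this article (the variable names are ours, purely for illustration; a petaflop is 10^15 floating-point operations per second):

    # Unit check for the speeds quoted above (illustrative only).
    TERAFLOP = 10 ** 12   # 1 teraflop = 10^12 floating-point operations per second
    PETAFLOP = 10 ** 15   # 1 petaflop = 10^15 flops = 1,000 teraflops

    bgw = 91.29 * TERAFLOP           # BGW's measured speed
    blue_gene_l = 135.3 * TERAFLOP   # Blue Gene/L, per independent testers
    japan_target = 3 * PETAFLOP      # three quadrillion calculations per second

    print(round(japan_target / blue_gene_l, 1))  # ~22.2x today's fastest machine
    print(round(japan_target / bgw, 1))          # ~32.9x BGW

In other words, the Japanese target would be more than 20 times faster than the current record holder.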

Last year, the US National Center for Supercomputing Applications (NCSA) asked Silicon Graphics to build for it the world's largest Linux supercomputer. The six-teraflop machine was to be used for weather-data analysis and for simulations of black-hole collisions and other large-scale events in the evolution of the universe.
