Monday, July 5, 2010

Bundling Hardware and Software to Do Big Jobs.

In data-center computing, the big trend today is to move from building blocks to bundles.

Rodney Adkins, senior vice president for systems and technology at I.B.M., which is ahead in developing customized systems. (Photo: John Marshall Mantel for The New York Times)

Suppliers are offering customers assembled bundles of hardware and software to make it easier and less expensive for them to cope with the Internet-era surge in data, an information flood that comes not only from internal databases but also from Web-based collaboration and smartphone applications, from sensors that monitor electrical use, environmental contamination and food shipments, and even from biological and genetic research.

The shift to packaging hardware and software together is behind the recent big deals and partnerships in the technology industry: Oracle's purchase of Sun Microsystems for $7.4 billion, an alliance between Hewlett-Packard and Microsoft announced last month, and a similar partnership between Cisco Systems and EMC.


But computer scientists at universities and technology companies say that simply putting the hardware and software building blocks together more efficiently for customers is not enough.


“The huge challenge is to take all this data and generate useful knowledge from it,” said Kunle Olukotun, a computer scientist at Stanford. “It’s an enormous opportunity in science and business, but it also presents a massive computing problem.”


The path to intelligently mining the explosion of data, Mr. Olukotun said, involves new approaches to breaking down computing tasks into subtasks that can be processed simultaneously — a concept known as parallel computing — and new system designs optimized for specific kinds of work.
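
As a minimal illustration of the parallel-computing idea, here is a generic sketch in Python, not drawn from any of the systems described in this article, in which one job is split into subtasks that worker processes handle simultaneously before the partial results are combined:

    # A minimal parallel-computing sketch: split one job into subtasks,
    # run them at the same time on several processor cores, then combine the results.
    from multiprocessing import Pool

    def subtask(chunk):
        # Each worker processes its own slice of the data independently.
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        chunks = [data[i::4] for i in range(4)]   # break the work into four subtasks
        with Pool(processes=4) as pool:
            partial = pool.map(subtask, chunks)   # subtasks run in parallel
        print(sum(partial))                       # combine the partial answers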


Designing computer systems around the work to be done is a departure from the dominant approach of general-purpose design, in which machines are built to be capable of handling all kinds of chores and are then programmed to do specific tasks.

Several companies are beginning to bring workload-optimized systems design into the mainstream of corporate and government computing. The promise, analysts say, is not only to open the door to exploiting the data flood for competitive advantage, but also to reduce energy costs and to help automate the management and administration of computer systems, a labor-intensive expense that is rising four times faster than the cost of hardware.

I.B.M., according to industry analysts, is at the forefront of the effort to develop more customized systems. And on Monday, the company is making the first of a series of announcements this year that embody the new approach.


I.B.M. is introducing a line of big computer servers built around its Power 7 microprocessors, with up to 32 processor cores in a system. They are priced at $190,000, typically run Unix or Linux, and are aimed at industries like finance and utilities, as well as at scientific researchers. Next month, I.B.M. plans to unveil far less costly server systems based on industry-standard microprocessors made by Intel. Those machines, which typically run Linux or Microsoft Windows, will be used for Web collaboration, e-mail and other applications.

“These are not simply hardware products, but the result of years of work and investment at every level from the silicon up through the software,” said Rodney C. Adkins, I.B.M.’s senior vice president for systems and technology. “And the real challenge is to optimize it all, not just the hardware.”

The early deployment of so-called smart utility grids points to the challenges of handling ever-vaster amounts of data. Smart electric meters can measure room temperatures and energy use hourly or at 15-minute intervals instead of the old pattern of utility service workers reading electro-mechanical meters every month or two.

The goal of smart grids, which governments are starting to heavily subsidize, is to give households and businesses timely information so they can change their electricity consumption habits to reduce energy use and pollution, and save money.


That involves not only collecting the data, but also analyzing and presenting it to consumers in ways that are easily understood — typically a personalized Web site graphically showing household electricity consumption and pricing.
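
As a rough sketch of that analysis step, the short Python example below rolls hourly readings up into the kind of daily summary such a site might chart; the sample data and the flat electricity rate are assumptions for illustration, not details of any vendor's actual software.

    # Hypothetical sketch: roll hourly meter readings up into a daily summary
    # a consumer-facing Web site could display. The sample readings and the
    # flat rate are illustrative assumptions.
    from collections import defaultdict
    from datetime import datetime

    hourly_readings = [               # (timestamp, kilowatt-hours used that hour)
        ("2010-07-05 00:00", 0.8),
        ("2010-07-05 13:00", 1.6),
        ("2010-07-06 09:00", 1.1),
    ]
    PRICE_PER_KWH = 0.12              # assumed flat rate

    daily_usage = defaultdict(float)
    for timestamp, kwh in hourly_readings:
        day = datetime.strptime(timestamp, "%Y-%m-%d %H:%M").date()
        daily_usage[day] += kwh

    for day, kwh in sorted(daily_usage.items()):
        print(f"{day}: {kwh:.1f} kWh, about ${kwh * PRICE_PER_KWH:.2f}")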

EMeter, a maker of smart-grid software in San Mateo, Calif., said that using I.B.M.'s workload-tuned Power 7 systems should more than double its capacity to manage smart meters, bringing it up to 50 million. In one eMeter project, utilities in Ontario are to install 4.5 million smart meters by 2011. Before, meters were read once every month or two. Under the digital system, readings will be made hourly, a more than hundredfold increase in the data generated.
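
A back-of-envelope calculation gives a sense of the scale; the per-reading record size below is an assumption for illustration, not a figure from eMeter:

    # Rough scale estimate for the Ontario project: 4.5 million meters
    # reporting hourly. The 50-byte record size is an assumption.
    METERS = 4_500_000
    READINGS_PER_DAY = 24                  # hourly interval data
    BYTES_PER_READING = 50                 # assumed record size

    readings_per_day = METERS * READINGS_PER_DAY
    gigabytes_per_day = readings_per_day * BYTES_PER_READING / 1e9

    print(f"{readings_per_day:,} readings a day")                 # 108,000,000
    print(f"about {gigabytes_per_day:.1f} GB of raw data a day")  # roughly 5.4 GB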

“If you can’t continually measure energy use down to the granular level in homes and businesses, the smart grid doesn’t work,” said Scott Smith, director of technical solutions at eMeter. “You need a tremendous amount of computing power to do that at scale.”

Computer scientists, biologists and other researchers from Rice University and the Texas Medical Center have been working with I.B.M. scientists to fine-tune Power 7 systems for cancer research. Genetic and protein-folding simulations require vast amounts of specialized high-speed processing and computer memory, said Kamran Khan, vice provost for information technology at Rice.

I.B.M., he said, dispatched three Ph.D. biologists to work with the cancer researchers. “They really understand computational biology,” Mr. Khan said.

The more bespoke approach to computer-systems design does increase the risk that customers are locked into one or two powerful companies. Indeed, the reason data-center customers long preferred the building-block approach to hardware and software was that it guaranteed competition among suppliers.

But the tradeoff has been that customers had to put the hardware and software together themselves, even as computing complexity and data-handling demands surged. Many companies, it seems, are willing to accept less competition among suppliers for the convenience and cost savings from prepackaged systems.

“It’s a balancing act,” said Frank Gens, chief analyst at IDC. “It does take away some choice, but also makes things a lot simpler. And that, after all, is the model that Apple used so successfully in consumer technology.”
