P2P turning desktop PCs into supercomputers



Industry analysts are touting P2P as one of the fastest-moving new technologies. AMR Research has named P2P computing to its Outlook 2001: Top 10 Technologies to Watch list. AMR predicts: "In 2001, we will see P2P expand rapidly among organizations that require large computing power and, as offerings become more available, in the average enterprise as well."

According to AMR, P2P has the potential to perform functions that would either not be feasible in a standard computing environment or would entail great cost. The group predicts that P2P can save corporations millions of dollars by making better use of existing equipment.

The concept is a simple one: P2P emulates a mainframe computer by drawing on the idle capacity of multiple desktop computers connected by a network.
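The core idea, splitting one large job into chunks that idle machines work on independently and then merging the partial results, can be sketched in a few lines. The example below is only an illustration: it uses Python's `multiprocessing` module, with local processes standing in for networked desktop peers, and a toy workload (a large sum of squares) in place of a real chip-design or modeling job.

```python
from multiprocessing import Pool

def compute_chunk(bounds):
    """One 'peer's' share of the job: sum of squares over a sub-range."""
    start, end = bounds
    return sum(n * n for n in range(start, end))

def distribute(total, peers):
    """Split the range [0, total) into one chunk per peer, then aggregate."""
    step = total // peers
    chunks = [(i * step, (i + 1) * step if i < peers - 1 else total)
              for i in range(peers)]
    with Pool(peers) as pool:          # local processes stand in for remote PCs
        partials = pool.map(compute_chunk, chunks)
    return sum(partials)

if __name__ == "__main__":
    # Four local processes play the role of four networked desktops.
    print(distribute(1_000_000, 4))
```

A real P2P system adds the hard parts this sketch omits: discovering peers, shipping work over the network, tolerating machines that go offline mid-job, and verifying returned results.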

And although many of the current popular P2P applications link computers over the Internet, companies like Porivo Technologies are developing P2P applications designed to aggregate companies' internal computing resources connected to corporate networks. These applications allow users to distribute work among internally networked computers without leaving the protection of the company's firewall, offering corporations opportunities to tap existing computing capacity without security concerns.

For major corporations with large, established networks of desktop PCs, it's an opportunity to leverage existing assets in a whole new way. By stringing together thousands of desktop computers, companies can run applications that would ordinarily require expensive mainframes.

Intel, the computer chip maker, uses a P2P application called Netbatch to harness the excess capacity of more than 10,000 workstations across its network to do computing-intensive chip design jobs. Before it began using Netbatch, Intel was buying new mainframes every time it designed a new chip. The company claims it has saved $500 million over the past 10 years using the application.

In a recent Network World Fusion article, Cheryl Currid, President of the Currid & Company consultancy, said P2P's big draw for corporate customers is untapped processing power. "What they get from peer-to-peer is low-cost, high-capability processing and storage," she said. She estimates that 75% of the average PC and 60% of the average server go unused.

Peer-to-peer technologies are poised to offer solutions to computer-intensive problems like complex molecular and drug interaction modeling that challenge existing R&D computing capabilities of pharmaceutical companies worldwide.

In fact, Andrea Williams Rice, a Managing Director and Internet Analyst at Deutsche Banc, predicts that as the technology matures it will enable companies to undertake even more complex problem solving. "Over time, the aggregation of enough computational power will go beyond allowing companies to solve existing problems faster and at a lower cost to tackling tasks that have previously been too computationally intensive to address," she wrote in a CNET.com column.

Tasks like protein folding predictions fall into that category. With existing supercomputers, researchers can fold short sections of proteins, but not full-length amino acid sequences.

IBM is currently developing a supercomputer specifically for full-length protein folding. The computer, called Blue Gene, is intended to be the world's fastest supercomputer, capable of performing one quadrillion operations a second (a petaflop). But analysts at Forrester Research question whether, as more pharmaceutical companies embrace P2P, Blue Gene risks being upstaged by aggregated computing strength.
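A rough back-of-envelope calculation shows what "aggregated computing strength" would have to look like to rival a petaflop machine. The per-desktop figure below (about 1 GFLOPS, roughly a circa-2001 desktop PC) is an assumption for illustration, not a number from the article; the 75% idle-capacity figure is Currid's estimate quoted above.

```python
# Back-of-envelope: how many idle desktops would it take to match a petaflop?
PETAFLOP = 1e15        # Blue Gene's target: one quadrillion operations/second
DESKTOP_FLOPS = 1e9    # ASSUMED: ~1 GFLOPS for a circa-2001 desktop PC
IDLE_FRACTION = 0.75   # Currid's estimate of untapped desktop capacity

desktops_needed = PETAFLOP / (DESKTOP_FLOPS * IDLE_FRACTION)
print(f"{desktops_needed:,.0f} desktops")
```

On those assumptions the answer is on the order of a million machines, which is why the Forrester scenario hinges on many companies pooling their networks rather than any one firm going it alone.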

In addition to problem solving applications, P2P technologies offer opportunities for networked file sharing. These applications are designed to facilitate secure business communications and offer new avenues for knowledge and document management, both internally and with alliance partners.

For instance, Mangomind, a service from Mangosoft, offers a secure, Internet-based way for multiple users to access, share and store files. As users make changes to documents and files, Mangomind automatically updates and synchronizes their files for sharing with other authorized users on the network.

Some P2P collaboration applications designed for sharing medical information with physicians and pharmaceutical sales reps also show promise for use with wireless, Web-enabled devices.

However, P2P is still evolving and many challenges remain. One of the biggest obstacles is the lack of a standardized method for integrating individual PCs, servers and workstations into the aggregate. But late last year a group of computer manufacturers and software companies, including Hewlett-Packard, Compaq, SGI and Platform Computing, joined forces to develop standards for the way computers are harnessed into distributed computing collections.

The new consortium, called the New Productivity Initiative, hopes to make distributed computing part of most standard operating systems, and plans to seek the endorsement of an independent standards organization. The group hopes its efforts make it easier to take advantage of distributed computing across individual organizations and the world.