systems and networks, the digital millwork of the modern company, have become steadily more
complex as their applications have multiplied. One of the main reasons for the complexity is the
historical lack of standards in computing hardware and software. Vendors have tended to
promote their own proprietary products, which by design don't mesh well with competitors'
gear. As a result, corporate software programs have generally been written to run on a particular
operating system, a particular microchip, a particular database, and a particular hardware setup.
Unlike the multipurpose mainframes, most server computers have had to be used as
single-purpose machines, dedicated to running just one software application or one database.
Whenever a company buys or writes a new application, it has to purchase and install another set
of dedicated computers. Each of these computers, moreover, has to be configured to handle the
peak theoretical demand for the application it runs, even if the peak load is rarely if ever
reached.
The proliferation of single-purpose systems has resulted in extraordinarily low levels of
capacity utilization. One recent study of six corporate data centers revealed that most of their
1,000 servers were using less than a quarter of their available processing power. Other studies
indicate that data storage systems are almost equally underused, with capacity utilization
averaging between 25 and 50 percent. Before the PC age, data-processing professionals viewed
the conservation of computing resources as not just an economic imperative but an ethical one.
"To waste a CPU cycle or a byte of memory was an embarrassing lapse," recalls the science
writer Brian Hayes. "To clobber a small problem with a big computer was considered tasteless
and unsporting, like trout fishing with dynamite." The client-server model killed the
conservation ethic. Profligacy replaced frugality as the defining characteristic of business
computing.
The complexity and inefficiency of the client-server model have fed on themselves over
the last quarter century. As companies continue to add more applications, they have to expand
their data centers, install new machines, reprogram old ones, and hire ever larger numbers of
technicians to keep everything running. When you also take into account that businesses have to
buy backup equipment in case a server or storage system fails, you realize that, as studies
indicate, most of the many trillions of dollars that companies have invested in information
technology have gone to waste.
And there are other costs as well. As data centers have expanded and become more
densely packed with computers, electricity consumption has skyrocketed. According to a
December 2005 study by the Department of Energy's Lawrence Berkeley National Laboratory, a
modern corporate data center can use up to 100 times as much energy per square foot as a
typical office building. The researchers found that a company can spend upwards of $1 million
per month on the electricity required to run a single big data center. And the electric bill
continues to mount rapidly as servers proliferate and computer chips become more powerful and
power-hungry. Luiz André Barroso, a computer engineer with Google, concludes that, barring
substantial improvements in the efficiency of computers, over the next few years, power costs
could easily overtake hardware costs, possibly by a large margin.
The waste inherent in client-server computing is onerous for individual companies. But
the picture gets worse, much worse, when you look at entire industries. Most of the software
and almost all of the hardware that companies use today are essentially the same as the hardware
and software their competitors use. Computers, storage systems, networking gear, and most
widely used applications have all become commodities from the standpoint of the businesses that
buy them. They don t distinguish one company from the next. The same goes for the employees
who staff IT departments. Most perform routine maintenance chores, exactly the same tasks
that their counterparts in other companies carry out. The replication of tens of thousands of
independent data centers, all using similar hardware, running similar software, and employing
similar kinds of workers, has imposed severe penalties on the economy. It has led to the
overbuilding of IT assets in almost every sector of industry, dampening the productivity gains
that can spring from computer automation.
The leading IT vendors have ridden the investment wave to become some of the world's
fastest-growing and most profitable businesses. Bill Gates's company is a perfect case in point.
Almost every company of any size today buys copies of Microsoft Windows and Microsoft
Office for all its white-collar workers, installing the software individually on every PC and
upgrading the programs routinely. Most also run at least some of their servers on a version of the
Windows operating system and install other expensive Microsoft programs in their data centers,
such as the Exchange software used to manage email systems. In the three decades since its
founding, Microsoft has grown to have annual sales of nearly $50 billion, annual profits of more than
$12 billion, and more than $30 billion of cash in the bank. And Microsoft has plenty of company,
from other software makers like Oracle and SAP to server suppliers like IBM and
Hewlett-Packard to PC vendors like Dell to the hundreds of consulting firms that feed off the
complexity of modern business computing. They've all happily played the role of weapons
suppliers in the IT arms race.
WHY HAS COMPUTING progressed in such a seemingly dysfunctional way? Why has
the personalization of computers been accompanied by such complexity and waste? The reason
is fairly simple. It comes down to two laws. The first and most famous was formulated in 1965
by the brilliant Intel engineer Gordon Moore. Moore's Law says that the power of