Saturday, May 10, 2003

Commoditisation and Overshoot


Brad DeLong, an economics professor at the University of California at Berkeley, puts it somewhat more succinctly: "I am optimistic about technology, but not about profits."
Source: The Economist



The Economist has an interesting survey on the IT industry. Interesting, however, is not always the same as analytically sound. Take the piece cited below on Google and commoditisation, for example. It is fascinating to read about what you can do on the cheap when you really try, and to see how those who really understand something about IT can extract much more value from it than end users who don't. But it also took me back to a point Joerg made in a piece I posted earlier in the week relating to NASA buying Intel chips on eBay. The point was to illustrate the fact that the introduction of building-block-type products follows a non-linear path: the process needs to pass a given threshold, and then it takes off. The phenomenon of technological 'overshoot' needs to be understood in this context: today's surplus-to-requirements extravagance may be rapidly converted into the foundation stone of tomorrow's growth transition (that, I take it, is the point being made at the end of the extract about Tetris). Innovation, in this context, forms part of an overlapping, nested system which can branch off in almost any direction without prior warning. This is what makes for the underlying strategic uncertainty which characterises the industry, and what makes it way, way too soon to start talking about maturation in this particular technological revolution, where even the expression General Purpose Technology seems to fall well short of the mark. Remember that the early Marxists were already describing late 19th-century capitalism as being in 'decadent' decline.
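
That "passes a threshold and then takes off" pattern is easy to picture with a toy logistic curve. The little sketch below is purely illustrative: the parameters (ceiling, midpoint, steepness) are made up for readability and are not drawn from anything in the survey.

```python
# A toy sketch only: a logistic "threshold then take-off" curve, with made-up
# parameters, to picture the non-linear adoption path described above.
import math

def adoption(t, ceiling=100.0, midpoint=10.0, steepness=0.8):
    """Cumulative adoption (as % of eventual users) at time t."""
    return ceiling / (1.0 + math.exp(-steepness * (t - midpoint)))

for year in range(0, 21, 2):
    level = adoption(year)
    bar = "#" * int(level / 2)
    print(f"year {year:2d}: {level:5.1f}%  {bar}")

# For the first few years the curve barely moves (the 'overshoot' capacity sits
# largely idle); once past the knee it climbs steeply, then flattens again.
```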

..........many IT firms would not be too unhappy if Google were to disappear. They certainly dislike the company's message to the world: you do not need the latest and greatest in technology to offer outstanding services. In the words of Marc Andreessen of Netscape fame, now chief executive of Opsware, a software start-up: “Except applications and services, everything and anything in computing will soon become a commodity.”.............

When the two Stanford drop-outs who founded Google, Sergey Brin and Larry Page, launched the company in 1998, they went to Fry's, an electronics outlet where the Valley's hardcore computer hobbyists have always bought their gear. Even today, some of the data centres' servers appear to be the work of tinkerers: circuit boards are sagging under the weight of processors and hard disks, and components are attached by Velcro straps. One reason for the unusual design is that parts can be easily swapped when they break. But it also allows Google's servers to be made more powerful without having to be replaced completely............

Because Google has always used commodity hardware and software, it is not easy to calculate how much money it has saved. But other firms that have recently switched from proprietary gear say they have significantly reduced their IT bill. Amazon.com, the leading online shopping mall, for instance, managed to cut its quarterly technology spending by almost $20m. The most interesting feature of Google's data centre, however, is that its servers are not powered by high-end chips, and probably will not have Itanium, Intel's most powerful processor, inside for some time yet. This sets Google apart among hot Silicon Valley start-ups, whose business plans are mostly based on taking full advantage of the exponential increase in computing power and similar growth in demand for technology.


......other “laws” of the semiconductor sector are becoming more important, and likely to change its underlying economics. One is the fact that the cost of shrinking transistors also follows an exponential upward curve. This was no problem as long as the IT industry gobbled up new chips, thus helping to spread the cost, says Nick Tredennick, editor of the Gilder Technology Report, a newsletter. But now, argues Mr Tredennick, much of the demand can be satisfied with “value transistors” that offer adequate performance for an application at the lowest possible cost, in the same way as Google's. “The industry has been focused on Moore's law because the transistor wasn't good enough,” he says. “In the future, what engineers do with transistors will be more important than how small they are.”

This is nothing new, counters Paul Otellini, Intel's president. As chips become good enough for certain applications, new applications pop up that demand more and more computing power, he says: once Google starts offering video searches, for instance, it will have to go for bigger machines. But in recent years, Intel itself has shifted its emphasis somewhat from making ever more powerful chips to adding new features, in effect turning its processors into platforms. It recently launched Centrino, a group of chips that includes wireless technology. The Centrino chips are also trying to deal with another, lesser-known, limiting factor in chipmaking: the smaller the processors become, the more power-hungry and the hotter they get (see chart 4). This is because of a phenomenon called leakage, in which current escapes from the circuitry. The resulting heat may be a mere inconvenience for users of high-end laptops, who risk burning their hands or thighs, but it is a serious drawback for untethered devices, where it shortens battery life—and increasingly for data centres as well, as Google again shows.

The firm's servers are densely packed to save space and to allow them to communicate rapidly. The latest design is an eight-foot rack stuffed with 80 machines, four on each level. To keep this computing powerhouse from overheating, it is topped by a ventilation unit which sucks air through a shaft in its centre. In a way, Google is doing to servers what Intel has done to transistors: packing them ever more densely. It is not the machines' innards that count, but how they are put together. Google has thus created a new computing platform, a feat that others are now replicating in a more generalised form. Geoffrey Moore (no relation), chairman of the Chasm Group, a consultancy, and a partner at Mohr, Davidow Ventures, a Silicon Valley venture-capital firm, explains it this way: computing is like a game of Tetris, the computer-game classic; once all the pieces have fallen into place and all the hard problems are solved, a new playing field emerges for others to build on.
Source: The Economist
LINK
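
To see why the "spreading the cost" argument in the extract matters, here is a back-of-the-envelope sketch. The growth rates below are invented for illustration (they are not figures from the survey): shrink costs are assumed to rise faster than transistors per chip, so the arithmetic only works out while unit demand keeps growing.

```python
# Back-of-the-envelope sketch of the argument quoted above, with invented
# numbers: each process shrink costs more than the last, so the cost per
# transistor only keeps falling if enough extra chips are sold to spread it.

BASE_SHRINK_COST = 1.0       # arbitrary cost units for generation 0 (assumed)
SHRINK_COST_GROWTH = 2.2     # assumed: each shrink costs 2.2x the previous one
TRANSISTORS_GROWTH = 2.0     # Moore's law: transistors per chip roughly double

def cost_per_transistor(gen, unit_growth):
    """Shrink cost spread over all transistors shipped in a generation."""
    shrink_cost = BASE_SHRINK_COST * SHRINK_COST_GROWTH ** gen
    transistors_shipped = (TRANSISTORS_GROWTH ** gen) * (unit_growth ** gen)
    return shrink_cost / transistors_shipped

print("gen   demand grows 50%/gen   demand flat")
for gen in range(6):
    growing = cost_per_transistor(gen, unit_growth=1.5)
    flat = cost_per_transistor(gen, unit_growth=1.0)
    print(f"{gen:3d}   {growing:20.3f}   {flat:11.3f}")

# With growing demand the cost per transistor keeps falling; with flat demand
# ("value transistors" are good enough) it starts to rise instead.
```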



And a final thought:

A measure of this increasing complexity is the rapid growth in the IT services industry. According to some estimates, within a decade 200m IT workers will be needed to support a billion people and businesses connected via the internet. Managing a storage system already costs five times as much as buying the system itself, whereas less than 20 years ago the cost of managing the system amounted to only a third of the total...........

One thing is clear: once all the technical challenges of grid computing have been overcome, hardware will have become a true commodity. Machines, storage devices and networks will lose their identity and feed into pools of resources that can be tapped as needed. This liquefaction of hardware, in turn, will allow computing to become a utility, and software a service delivered online.
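
A quick bit of arithmetic on those quoted storage ratios (no dollar figures are given in the extract, so this works with ratios only): if management once came to a third of the total, it was half the purchase price; at five times the purchase price it is now roughly five-sixths of the total. A minimal sketch:

```python
# Ratios only, taken from the figures quoted above; no actual dollar amounts.

def management_share(mgmt_to_purchase_ratio):
    """Management's share of total cost (purchase + management)."""
    return mgmt_to_purchase_ratio / (1.0 + mgmt_to_purchase_ratio)

then = management_share(0.5)  # ~20 years ago: a third of the total = half the purchase price
now = management_share(5.0)   # today, per the extract: five times the purchase price

print(f"then: management was about {then:.0%} of total storage spend")  # ~33%
print(f"now:  management is about {now:.0%} of total storage spend")    # ~83%
```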

And the 24-billion-dollar question is: where will these 200m IT jobs (if that is how many there are) and all that grid-computing hardware be located?
