They are unique. They were once abundant. They are perfectly adapted to an environment that is changing. There is still a significant population, but life is increasingly difficult for each succeeding generation. They may be unable to migrate, and if they can't, the population will soon disappear. What are we talking about? Dodos in the 17th century. And, possibly, iSeries systems today.
Blame Charles Darwin, if you want. More than 150 years after the dodo was dead, he observed that evolution was a process in which the species that survived were the fittest. He didn't mean the strongest, smartest, prettiest, or most vicious. He meant the creatures that were best suited to the environment in which they happened to live. He observed that variations in plant and animal species occur naturally, and that some variations survive while others do not. Evolution, in his estimation, was the result of this natural selection.
Darwin's ideas are not only important in themselves; the contrapositive of his concept, that those who do not survive must be somehow unfit, along with other twists of his powerful notion, have been used to justify all sorts of social behavior, some of which is unworthy of justification, much less justification by association with the great 19th century naturalist.
Still, the idea that winners deserve to be winners is very pleasing to winners. This is true whether the winner is an individual, a group, a company, or a nation. It all boils down to might makes right, something Darwin did not happen to say was the basis of evolution — or anything else, for that matter.
But the things Darwin was getting at do seem to apply in situations quite distinct from tortoise life on the Galapagos Islands and the coming and going of the dodo on Mauritius. There are useful analogies between Darwin's view of nature and less distant events right in front of our own noses, when our noses happen to be pointing at the screen of a workstation.
Computers are not living things, but they can be said to evolve in their own fashion. Some species of hardware and software prevail, while others die out. The process of selection is hardly natural, of course. To some extent, it is a matter of economics, with voters participating in an election based not on one vote per user but on one vote per unit of currency. But it's not a fair and open election, and perhaps that makes it a bit more natural. The vendors are always trying to rig the votes, and their apparatchiks sometimes do a very good job of this. However, customers and vendors often get unexpected results, outcomes that at best may be understood only when it is far too late to do anything about them.
Nature lets extinct species lie. Industrial development is not always a one-way process, but its reversion to technologies of the past is a very rare occurrence, and in some cases it is pretty easy to rule out any reversal. So it is usually safe to say that when an industrial product line dies out, it's gone forever.
The iSeries, which is recognizably descended from the System/38 via the AS/400, has not evolved as rapidly as some other architectures, such as those based on Intel chips, but it has progressed more rapidly than, for example, zSeries mainframes, the most powerful IBM computers. The evolution of the iSeries seems to have two themes, both of which are consistent with the developments occurring among other architectures. Processors have become more powerful. And the internal architecture of the machines has been radically changed to reduce production costs.
The first of these themes is an aspect of the general trend in computing devices. This part of the iSeries evolution is due in part to improvements in the processor chips used in the machines and in part to changes in other structural elements, such as memory and input/output subsystems. The result is that customers can get much faster systems than they used to. Actually, the extra speed is not always optional. At the low end, the power of entry-level systems keeps rising, much the way the least costly PC on the market today is much faster than any PC built a few years ago.
The other kind of change that has enabled iSeries manufacturing costs to fall has not been merely the byproduct of progress in electronic components; the iSeries has become less costly to make because it is a computing magpie, hatched in the nest of a pSeries Unix platform. As a result, the iSeries takes advantage of standardized components, some of them IBM's de facto standards (such as those in the Power chip family), but most of them industry standards that affect the design of memory and choice of component interconnection schemes.
One consequence is apparent in the lower prices IBM asks of customers. But this benefit has been offset by a related cost: recompiling applications developed for the old processors (like the CISC chips IBM developed for the AS/400) so they can run properly in the part-real, part-virtual environment of the iSeries and pSeries boxes. OS/400 applications jumping the CISC-to-RISC gap get compiled on the fly the first time they run on a new RISC processor, because application code is always compiled to an intermediate abstract form (much like Java bytecode) that allows it to move unchanged from, for example, a V3R7 CISC AS/400 to a V4R2 RISC AS/400. Almost all applications run just fine without any additional investment, but cautious iSeries users generally feel that their code should be kept as current as possible. They tweak and recompile their applications to take advantage of the most recent hardware and OS/400 features. If they didn't, their jobs wouldn't run as fast as they could. That's a cost, too.
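The translate-on-first-run scheme described above can be sketched in a few lines. This is an illustrative toy, not actual OS/400 internals: the program names, the fake intermediate instructions, and the functions (translate, run) are all invented for the example. The point is only the pattern: store programs in an architecture-neutral form, translate to the target processor's code the first time they run there, and cache the result so later runs skip the translation step.

```python
# Hypothetical sketch of compile-on-first-use from intermediate code.
# Nothing here reflects real OS/400 data structures or instruction names.

IR_PROGRAMS = {
    # pretend architecture-neutral intermediate code for one program
    "payroll": ["LOAD a", "ADD b", "STORE c"],
}

_native_cache = {}  # (program name, target architecture) -> translated code


def translate(ir, target):
    """Pretend translator: map each IR statement to a target instruction."""
    return [f"{target}:{op}" for op in ir]


def run(name, target="risc"):
    """Return runnable code, translating only on the first call per target."""
    key = (name, target)
    if key not in _native_cache:
        # first run on this processor: compile on the fly, then remember it
        _native_cache[key] = translate(IR_PROGRAMS[name], target)
    return _native_cache[key]


first = run("payroll")    # triggers translation for the "risc" target
second = run("payroll")   # reuses the cached translation
```

The same intermediate code could be handed to a different `target` without touching the stored program, which is the property that let applications cross the CISC-to-RISC gap unchanged.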
While users have indeed gotten new features as they move to each successive generation of hardware and software, the same features could have been added to older systems if IBM had decided to fund an independent hardware development effort for the iSeries. But IBM felt otherwise. It decided that its iSeries manufacturing costs had to track the costs of Unix platforms. Apparently, the best way to do this was to build a single AS/400-iSeries and RS/6000-pSeries platform, with these platforms suitably dressed in different operating systems and microcode to provide product differentiation.
The problem with this approach is that it has altered the user's perception of the iSeries. Asked to pay more for their machines than for comparable Unix platforms, users felt punished for having relied on IBM to keep iSeries pricing competitive, so that their computing costs would remain in line with the costs at companies using Unix, Linux, or Windows servers. While IBM has maintained that the total cost of ownership of the iSeries is at least as good as the TCO of alternatives, customers have their doubts. The prices of Intel- and RISC-based servers seem to be falling all the time, while the prices of iSeries computers are reviewed less often. IBM may offer special deals to keep the list price gap under control, but these offers can be so complicated and involve so many contingencies (such as discounts on additional hardware or software that a customer may or may not ever buy) that they require customers to build ornate spreadsheets before they can see how the deals might affect them. There are special deals from every vendor on Windows and Unix boxes, too, but they are usually a lot easier to understand, and because they seem more straightforward they make the iSeries deals appear, well, less savory. When IBM does play catch-up, it is often a bit late. The January revamping of the iSeries is an example of this.
It's not just IBM that has changed, to be sure. Customers just don't think the way they used to. The way they used to think was more or less the way IBM wanted them to. Today, they seem to make up their minds for themselves. The consequences of this more independent attitude, in combination with various decisions IBM made during the past several years, are only now becoming apparent. The upshot is that it is harder for IBM (and its agents) to sell iSeries systems than it used to be, particularly to users getting their first servers.
This problem at the entry level is very serious. It is not apparent in the data IBM offers its shareholders, because the big money comes from the big iSeries sites, not the little ones. But if the situation in the mainframe business, home to IBM's other proprietary environments, is any kind of guide, iSeries users could be in for some unpleasant developments — and a harder time than mainframe users are having.
IBM no longer offers entry-level mainframes. Basically, there are no new mainframe users, and all the action in that market comes from customers buying upgrades, replacing older machines with new ones, or adding machines. Whether or not mainframes are efficient compared with, for instance, big Unix boxes, in the mainframe arena users are likely to compete with other companies using the very same equipment. The result is that no competitive advantage or disadvantage arises from one company using a mainframe while another uses a completely different platform.
That is not the case in the midrange market, where in just about every segment some companies use an iSeries while a rival might be using Unix, Linux, or Windows servers to support functionally similar (or identical) applications. All other factors being equal (or balancing out), if alternatives to the iSeries provide a significant business advantage or disadvantage, users of those systems will gain or lose at the bottom line. The user of an iSeries will grow or shrink as a result, and with that change in fortune will come a change in computing requirements. The iSeries user base will spiral up or down, depending on whether the iSeries is a significant benefit or detriment.
Market data suggests that the iSeries is a diminishing part of the midrange computing market, and IBM has to reverse that trend because it will feed on itself. But this is difficult, perhaps impossible, unless IBM can get a lot more first-time server users to choose an iSeries rather than an alternative. A growing user base is not only important as a seed for IBM's future sales of iSeries technology, but also the basis on which ISVs decide how to allocate their development resources. If the market for their iSeries software looks like it will stagnate and opportunities on other platforms seem more promising, the ISVs will put their scarce resources where future opportunities appear to lie. The applications software advantage held by the iSeries could slip away. The barriers to migration off the iSeries posed by a lack of familiar applications on other platforms could fall, too.
Under the most difficult conditions imaginable, in which IBM gave up on its effort to court new users, the iSeries would fade away. There might not be any obvious signs of this, or any sudden change in the way the iSeries world looks to users. But the dodo didn't get killed off all at once, either.
The story, briefly: in 1598, Dutch sailors landed on Mauritius, the main island of the eponymous group that lies east of Madagascar in the Indian Ocean. The sailors found some big birds, birds the size of major league turkeys, weighing as much as 50 pounds. These birds were totally unafraid of the sailors, as they had no natural enemies on Mauritius. Dodos became a major item on the menus of visitors to the island, but that is not what killed them off. On successive visits, sailors left behind pigs and dogs, some of which produced feral offspring, and these animals eventually ate the dodos' eggs. This was made possible by one aspect of the dodo's physiology. Although related to such members of the pigeon family as the solitaire, which lived on Reunion Island, the dodo could not fly, so it nested on the ground ... and never learned to guard its nests from predators.
After 1681, nobody saw another dodo. There are some skeletal remains in museums, including the natural history museum at Oxford University, but there are no complete specimens. What little we know of the bird's appearance is based on paintings done when the creatures were still alive and some speculative work by naturalists, plus an appearance in Alice in Wonderland, written by Lewis Carroll (Charles Dodgson), a contemporary of Darwin, and illustrated by Sir John Tenniel.
— Hesh Wiener April 2003