Technology News



Another Perspective



DARNED COYOTE

It's a common enough experience.  You ask yourself, "What am I doing here?"  Don't think of this as a moment of doubt or confusion.  Think of it as an introduction to cosmology, and as one of the many questions for which there may be no answer — not yet, maybe not ever — at least for mankind.  But don't let that discourage you or keep you from asking other difficult questions such as, "Just how many different computer architectures do we need?"

Of course, the very mention of questions like these may annoy you.  If that's the case, and you're looking for people to blame for silly questions, it's not unreasonable to start with the British.  These days, if you want to know what's up with cosmology, you probably have to pay attention to Stephen Hawking, a Brit.  If your focus is a bit narrower, confined to more recent events, such as things that have happened since life appeared on earth, you'll undoubtedly know that a lot of the big questions were asked by Charles Darwin, whose The Origin of Species was published in 1859 and about which arguments have never ceased.  People who think Darwin's ideas run against belief in a Creator find support in William Paley, a Brit whose last major work, Natural Theology, published in 1802, made a strong and long-lasting case for a superior intelligence working behind the scenes.  Paley's classic argument is that, if you were walking in a field and found a rock, you'd think everything there was natural, but if you found a watch, you'd have evidence that an intelligence had passed by at some point.  Paley found lots of "watches" in the world around him.

William Paley
"You can deny the existence of
a Creator, but not on my watch"

Arguments about computer architectures don't get tangled up in theological constructs, even though the scientists, mathematicians, and engineers whose work has shaped computing include people with just about any point of view regarding humankind, religion, or philosophy you can imagine.  But if you want to locate people whose concepts lie at the heart of stunning insights and heartfelt intellectual controversies, you could do a lot worse than point a finger, or Galileo's finger, at Alan Turing, yet another Brit.  His work showed that, in essence, any computer ought to be able to mimic any other computer, although that work dealt with linear rather than parallel computing.  Whether linear computers or von Neumann (not a Brit) type machines are the same as parallel computers, if you apply suitable transformations, is another matter and one we're not equipped to deal with here and now.

Boiled down, Turing's mathematics and the practical experience of computer scientists make it clear that you can port any program to any computer architecture, although not necessarily with as much efficiency as you would prefer.  In theory, then, any architecture that becomes cheaper to build could sweep the market.  In practice, users are locked in by applications that require too much costly recoding and storage translation to float among the architectures.  But for the past several years, much of the growth in computing has been in standards-based equipment, and it's hard to imagine that China and India, each given a chance to learn from the rest of the world and to start with a relatively clean slate, won't try to do things a bit more cleverly before they, too, succumb to inertia and to the drag of legacy technologies.
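For readers who like to see the idea in the flesh rather than in prose, here is a minimal sketch, in Python, of what Turing's universality amounts to: one general-purpose simulator that can mimic any machine you hand it as a table of rules.  The simulator, the rule encoding, and the little machine that increments a binary number are our own illustrative inventions, not anything lifted from Turing's paper or from any particular architecture.

    # A minimal sketch of universality: a single simulator that will run
    # any machine described as a table of rules.  The encoding here is a
    # made-up illustration, not Turing's own notation.

    def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
        """Simulate a single-tape machine.

        rules maps (state, symbol) -> (new_state, symbol_to_write, move),
        where move is -1 (left) or +1 (right).  Halts when no rule applies.
        """
        tape = dict(enumerate(tape))   # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            symbol = tape.get(head, blank)
            if (state, symbol) not in rules:
                break                   # no rule for this situation: halt
            state, write, move = rules[(state, symbol)]
            tape[head] = write
            head += move
        # Render the visited portion of the tape back into a string.
        lo, hi = min(tape), max(tape)
        return "".join(tape.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

    # Rules for a toy machine that adds 1 to a binary number:
    # walk to the rightmost digit, then carry leftward.
    increment = {
        ("start", "0"): ("start", "0", +1),
        ("start", "1"): ("start", "1", +1),
        ("start", "_"): ("carry", "_", -1),
        ("carry", "1"): ("carry", "0", -1),
        ("carry", "0"): ("done",  "1", -1),
        ("carry", "_"): ("done",  "1", -1),
    }

    print(run_turing_machine(increment, "1011"))   # prints 1100

The point of the exercise is that the simulator never changes; only the rule table does.  That is the sense in which one machine can stand in for any other, portability and efficiency costs aside.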

In the very short term and in the narrowest sense, your immediate choices might be limited to boxes based on Itanium, Inteloid-64 (of which there are at least two flavors, Intel's Xeon and AMD's Opteron), Power, Sparc, and IBM mainframe architectures.  The processors in cell phones and PDAs are unlikely to grow up; it's more likely that the processors in computers will shrink down.

This is not merely a puzzle you contemplate for the fun of it.  The wisdom of decisions you make every day may depend on whether all these architectures survive, or, given the many good ideas that keep cropping up in computing, whether a new generation of processor chips will emerge and reshape the landscape.  Do you just focus on the tactics, or should you be thinking in terms of a larger strategy?  Or does the best approach involve some of each?

Users deeply committed to a particular architecture are a bit like the protagonist in Flatland, written by another Brit, Edwin Abbott Abbott (yes, yes, that's his name), and published in 1884.  The book is set in a two-dimensional world where a character is given a glimpse of a three-dimensional world.  It's about mathematics, but it's also about understanding that there may be things we cannot see in all their dimensions, because we live in a restricted world.

Cosmologists may seem to be addressing completely unrelated issues, and very esoteric ones at that, but in fact their work bears on more immediate matters than you might think.  The debates in that crowd are tied to very concrete work in particle physics and involve discussions of physical constants as much as philosophical abstractions.  One recent book that presents an overview of recent thought and tries to show the connections between big ideas and the daily observations of scientists in a number of fields is Mike Mallary's Our Improbable Universe.  Mallary is a distinguished physicist, a fellow who has contributed in a major way to the development of magnetic storage, and, I must disclose, since I highly recommend the book, a personal friend.  If you've never explored the boundaries where science meets philosophy, this is as good a travel guide as you're likely to find.

Right now might be just the time to get your bearings.  IBM, Sony, and Toshiba say they have a new circuit that is going to change the whole world of computing, starting in your living room, where their forthcoming processor technology, called Cell, will find its first home.  Promotion for the new technology, other than press releases and technical papers aimed at the trade, has not begun, but selling something as exotic as Cell to consumers is not going to be easy.  It will require the right spokesperson.  We nominate Martha Stewart.

With or without Martha, the Cell dog-and-pony shows are a clear sign that there still might be room to innovate in the processor world.  In fact, the big, old technology companies may have come around to thinking that they might desperately need to foster change, if they don't want to die a death of a thousand commodity price cuts.  If they're right, and they might be, the changes they engender might or might not go over with their customers.

Change brings with it benefits, but also costs and risks.  People who buy computers and software and related things might welcome the benefits, but they don't particularly like the costs and risks.  This reluctance to change is the basis of legacy markets.  It's worth a lot, and the evidence to support this comes in the form of prices paid for legacy computing products, compared with the prices paid for newer and different products that always threaten and sometimes devour the markets for legacy products.

It's all about survival of the fittest, and what makes for fittest in the world of technology is often something that's not quite as good but is much more affordable.  The allure of bargain technology has been most intelligently explored by gentle giant Clay Christensen in The Innovator's Dilemma, published in 1997, and elsewhere.  Basically, Christensen shows how, in markets as diverse as those for disk drives and steam shovels, new technologies almost inevitably displace older technologies by coming in cheaper, with less functionality, and growing to engulf the territory occupied by their predecessors.

In the living world, such processes of succession and displacement are called evolution, and we now understand a little bit about how this happens, at least at the molecular level.  We also understand a little about how a place where this can occur, Earth, might have been formed, or at least how physicists, astronomers, and those dreaded cosmologists believe we got where we are; the cosmologists include many British folk, whose entire culture at times seems to be best explained by noting how often they ask, "What are we doing here anyway?"

It turns out, as you might well know, that quite a few things have to be just right for life to exist on Earth.  It is, as our friend Mallary says, improbable.  And contemplation of the unlikeliness of it all, which has intrigued scientists and philosophers since time immemorial, has during the past few decades been expressed in part by an idea called the anthropic principle.  This concept, the current shorthand for a phrase made popular in academic circles by The Anthropic Cosmological Principle, a tome published, unsurprisingly, by Oxford University Press, stems from the tautology that if we're here asking questions, clearly the universe as we know it is capable of supporting intelligent life, also as we know it.  This leads thinkers to ask that same old big question, "So how come?"

While a lot of the thinking surrounding this, and other matters dear to the hearts of cosmologists, is about space-time and fundamental laws of physics, some of it inevitably trips over the question, put perhaps too starkly, of whether things are the way they are because of chance or, alternatively, because there is a creator or a designer of some kind behind it all.  Serious thinkers always point out that between the extremes lies a vast area that may be thought of as worlds that were started by a creator and then left to run according to some set of physical rules.  Even so, people get pretty heated up when they discuss this matter.  You can Google for "anthropic principle," and you'll see just how stimulating cosmology can be.  There are plenty of serious, carefully edited sites devoted to this topic and related topics, plus a vast number of do-it-yourself sites built by what Americans call bloggers and Brits call wankers.

Coyote
"The very instant you think you've seen me,
I'm somewhere else, just like it says in
this book on quantum mechanics."

There might be as many wise, and as many foolish, arguments here as there are about computer technology, but we haven't actually tried to count, either.  What we have noticed in cosmology and computing is that a large and, we reckon, excessive amount of thought seems to have gone into futile contemplation of what or who is responsible for the way things are unfolding.  In computing, the facts may show how various technologies were discovered or invented or developed, and sometimes there is a little information about how various ideas managed to catch on, but pretty soon the discourse flows into predictions about the future of computing, and these are overpriced at a dime a dozen.

Good cosmology and good science are often similarly extended, most embarrassingly to anyone who has actually tried to put a little serious thought into such topics, by people whose hubris extends to the realm of dealing with the idea of a creator as if that creator were a guest on a TV talk show.  It is here that I feel compelled to raise a question of my own.

Every culture has its theories about creators; sometimes there's one, sometimes there are many.  This is as natural as the inevitable (if occasional) wondering by any intelligent being about just why we happen to be here.

But just look at the world we live in.  Look at how we behave.  Look at how we treat each other, nature, and even life itself.  A creator of this world might not be one of the creators we may have been raised to believe in, or one we discovered through study, meditation, or life itself.  It just might be, considering the way things are, the way they have been, and the way they seem to be going, that the most likely kind of creator who gave us this world, with its peculiar physical constants, odd subatomic particles, and almost inexplicable physical forces, is the Coyote, the Trickster of Native American culture.  (Okay, Mordechai Richler, in Solomon Gursky Was Here, favored the Raven as Trickster, and if you enjoy a good potlatch, which is a very American thing these days, you're welcome to go that way, too.)

The Trickster is a complex character, and can be counted on to be good, bad, funny, disruptive, smart, and stupid, among other things, whenever you expect otherwise.

Sometimes the universe seems like that, too.  And every now and then even the finest and best behaved part of the universe, the computer industry, makes one wonder if there isn't a Trickster behind the scenes.

Now stop asking questions you'll never get answered and get back to work.

— Hesh Wiener, February 2005


Copyright © 1990-2017 Technology News of America Co Inc.  All rights reserved.