In 1964, as IBM announced the System/360, Marshall McLuhan, a professor at the University of Toronto, published a remarkable book, Understanding Media: The Extensions of Man. He said each medium, independent of its content, is a powerful social force with characteristics that reshape the way people are interconnected.
McLuhan distilled his thesis to a single memorable phrase: the medium is the message. Like print, radio, movies, and television, computing technologies, from the punch card to the mainframe to the mobile internet, are media, too. IBM doesn't fully understand this; consequently, it flails and struggles.
One of McLuhan's observations about media is that they generally carry older media as their content. For example, the medium of theatre and the medium of print publishing are the content of films. Films can be the content of television. Television has become part of the content of website presentation. But as one medium uses a predecessor for content, the nature of the newer medium may differ greatly from that of the older one.
As an example, McLuhan characterizes film as a hot medium, by which he means that film in a theatre shown on a big screen with its rich images floods the main sense, vision, with information. The viewer doesn't have to do much work to catch all the details; on the contrary, the viewer may be overwhelmed. Add in surround sound and even without 3D or VR presentation, the audience is awash in stimulus. By contrast, the same film presented on a small television screen, the kind that was the norm during McLuhan's time, 50 years ago, requires the viewer to psychologically lean in, to do some mental work to catch the detail. McLuhan calls the low-res TV of his time a cool medium, his term for a medium that demands effort from a viewer.
Today, TV is designed for two-meter-diagonal, high resolution, high definition screens. It can provide an experience that more closely resembles the one offered by a cinema than the old round-corner boob tube of the vanished past. But that is only one kind of video experience. A viewer catching a show on a tablet or a smartphone is not immersed. A high end handheld screen may pack in a lot of pixels, but a viewer cannot take in all the subtleties a big screen presents. So that TV show, whether a film or a sports event or a news report, can be hot on a home theatre screen but cooler on a Kindle Fire and very cool on a 5-inch phone.
McLuhan tried to give his readers, his students, and his consulting clients (including IBM) a way to think about the varying impact of different media and the way the recipient might react to the medium itself as well as to its content. A hot medium such as a big screen film may produce an emotional reaction in a viewer, but that viewer is sitting in one place in the theatre. That same film at home may allow some different freedoms. There may be a pause button that breaks the spell of a movie even as it allows the viewer to grab refreshments or halt to take a phone call. On a tablet or phone, incoming traffic is potentially available at all times. The viewer is constantly reminded of her connection to others, including friends, family, and robots offering traffic guidance or weather alerts or email from the bank.
In computing, the information processing technology of McLuhan's time was just beginning its transition from mechanical to electronic functionality. Students at McLuhan's Toronto school who took courses dealing with computers probably submitted their homework as a deck of punch cards and had the resultant output, most likely on green-bar paper, graded by hand or evaluated by software, the last step in what amounted to a multi-step industrial process. The homework might first have to be compiled and then put in a queue for batch processing. By the time students who entered U of T the year Understanding Media was published completed their four-year degrees, punch cards had largely given way to terminal-based interaction.
But the IT process remained linear and essentially mechanical. It would be several more years before university students would get their own personal computers. Networking didn't become cheap and essentially ubiquitous until the 1980s. And the Macintosh, the first affordable personal computer with a nice graphical user interface, didn't hit the market until 1984, 20 years after McLuhan's book on media. By that time, McLuhan had been dead for more than three years. He would not be around to observe the effect of the GUI as an influential and popular medium; he would not be around to see interactive computing in its infancy during the era of AOL and its ilk mature into an Internet with vast search facilities, huge bandwidth, and wireless connectivity capable of bringing electronic payment to rural West Africa and of summoning Uber cars to your front door.
Before 1964, IBM had built its business on technology that read a card and printed a line. Some of this technology was still largely mechanical, processing paper cards and sorting or selecting the cards using brushes that felt for punched holes and paper guides that sorted cards into banks of hoppers. The technological high end of IBM's product line was still migrating from electronic systems based on vacuum tube triodes to circuit cards using discrete transistors. Magnetic tape was the emerging storage medium; disks were not yet sufficiently capacious or adequately affordable to displace mag tape. Tape is still a widely used archiving medium, possibly awaiting extinction by disks in the cloud but by no means assured of consignment to the dustbin of history.
IBM's corporate thinking, like that of the contemporary industrial empires that were its customers, mirrored the information processing machines it built. Computing, even as it went electronic, involved breaking a problem down into processing components the way an industrial assembly process was divided into tasks. The components were executed in sequence, each receiving as input the output from a prior stage of work, each yielding as output a transformed batch of data.
Until very recently, IBM personnel at work were largely shielded from the transition of computing from punch cards to richly interactive mobile multimedia activity. IBM's System/360 was at first an electronic embodiment of punch card systems and the batch processing technology of earlier computing systems like the IBM 1401. It took IBM the better part of a decade to upgrade the 360 to the 370, and even then the early 370 models didn't feature what would quickly become their defining technological advance: virtual memory. Still, by the mid-1970s IBM was showing customers that computing via CRT terminals was a key step on the path to the future. IBM's mainframe processor business and, in parallel, its lines of small and midrange systems, were thriving. But by that time, IBM had begun to lose touch with developments in semiconductor manufacturing, communications technology, and software that would trip it up during the 1980s.
Just as the mainframe seemed in some ways to be the cinema version of the punch card apparatus, the personal computer was turning into the television version of the glass house system, a development that was for all practical purposes lost on IBM management. The first personal computer that became known around the world was the MITS Altair 8800, featured in Popular Electronics magazine in 1975. In just a few years, dozens of companies were selling hundreds of thousands of small computers. These computers were truly a different medium from the glass house systems they would soon transform and, eventually, as servers developed that used the technology popularized by personal clients, largely replace.
By the 1980s, personal computers were evolving into a medium that encompassed data processing, added the potential for animated video and delivered audio content. These small computers accepted, in addition to keyboard data, tactile input from a mouse. They didn't provide a rich and immersive multimedia experience at first, but they were clearly headed in that direction.
IBM at the corporate level remained oblivious to the nature of personal computers even as it began to provide its own PCs to a hungry market. At the same time, IBM management didn't pay much attention to the impact the PC business was having on the semiconductor business. Imagine if Ford never paid attention to developments in the steel business. That was IBM during the 1980s.
Because IBM management seemed to pay attention only to PCs used in corporate offices, it didn't notice how its employees used these machines in their homes and, perhaps more importantly, how its employees' children used them. It never saw that, once you took it out of a quiet office, the personal computer was a multimedia presentation device with a tremendous appetite for communications bandwidth. IBM didn't even figure out that lighter and smaller (but ever more powerful) PCs were taking over the outside-the-office market until its rivals were regularly eating large chunks of that market.
While IBM was putting its efforts into figuring out how to sell laptops made in China and desktop PCs made wherever the price was right, other companies were trying to pack the power of a PC into a phone. Nokia, Motorola, and others were doing a pretty good job of this.
Meanwhile, all during the 1980s IBM management listened to its large account sales reps who said their customers wanted MIPS, MIPS, and more MIPS. IBM, thinking as linearly as its read-a-card, print-a-line old data processing systems behaved, interpreted this as a signal to build ever larger factories. Nobody was thinking about Moore's Law or, even though the IBM PCs used Intel chips, about what Moore learned in the shark-infested memory business, to say nothing of the do-or-die-but-shrink-that-die processor business.
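The arithmetic IBM's planners ignored is simple enough to state in a few lines. A minimal sketch of Moore's Law compounding, assuming the textbook two-year doubling period; the starting count and dates are illustrative, not IBM or Intel data:

```python
# Moore's Law as compound doubling: density roughly doubles every
# fixed period, so capacity grows exponentially with elapsed time.
# The doubling period (2 years) is the commonly cited figure; actual
# cadence varied by era and by product line.

def project_density(start_count: float, start_year: float,
                    end_year: float, doubling_period: float = 2.0) -> float:
    """Project a transistor (or bit) count forward under steady doubling."""
    doublings = (end_year - start_year) / doubling_period
    return start_count * 2.0 ** doublings

# Over the 1980s alone, five doublings compound to a ~32x gain --
# which is why ever-larger factories were the wrong answer to
# customers asking for more MIPS.
gain = project_density(1.0, 1980, 1990)
print(gain)  # 32.0
```

The point of the sketch is that capacity demands could be met by shrinking the die rather than multiplying the floor space, which is what the memory and microprocessor makers understood and IBM's factory planners did not.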
Driven more by the PC business than the glass house systems market, semiconductor companies in the US, Europe, Japan, and elsewhere in Asia were bringing down the cost and size of memory modules at a breakneck pace. In parallel, armed with ever-improving automated design systems, the processor makers were learning to put an awful lot of processing power onto a single chip, and how to mount multiple chips in very small but thermally robust packages.
IBM was building giant factories while its scientists and engineers, to whom management apparently paid practically no attention, learned to pack orders of magnitude more MIPS and huge amounts of memory into each frame. By 1990, one of IBM's key problems was its excessive inventory of manufacturing plants. It needed less space to build more glass house computers; it was unable to develop automation or other manufacturing technologies that would give it a leadership position in PC fabrication; and it failed to see that computing and mobile telephony were converging into a new medium that right now we call client device computing, but which might get a new name as we gain experience.
IBM management may now realize that its processor business is on the road to oblivion, but it may not have any viable options. The company's leaders and certainly their legal counsel ought to know that the only reason there is a mainframe business at all is IBM's superb skill at blocking the efforts of competitors. These rivals, left unchecked, would long since have offered X86-based mainframe emulators that put the IBM z architecture into smaller boxes and, these days, into frames that can live in cloud server farms. A similar situation exists in the Power market, which persists because IBM prepaid $1.5 billion in subsidies to GlobalFoundries to create the illusion of affordable technology for its next generation or two of processor chips.
Just as the punch card medium became content for the 1401 and the 1401 became content for the mainframe, the mainframe could become content for the computing cloud, if it survives at all. But the back end system, whether mainframe or Power or IBM i or Oracle Sparc is only one aspect of future information technology, the way movies are only one aspect of the mobile tablet presentation environment.
IBM is promoting the idea that stuff it calls cognitive computing may help it enjoy a successful future. It is making a big effort to put its impressive Watson technology to work on vexing problems, such as the selection of cancer therapies. But so far the company hasn't found a way to ask Watson for help with some of the immediate matters that affect Big Blue's operations, such as the edginess of customers who worry about IBM's staying power, the effectiveness of employees who question the security of their positions, and the shareholders whose main case for investment is "well, Warren Buffett did it."
— Hesh Wiener May 2016