Commercial trends in every productive society sometimes run to excess. Information technology may be reaching such an apogee, getting just a little too complicated for its own good, and for the good of everyone it touches. It's starting to look like computing needs to get back to basics and, if possible, to do so in good taste. There's plenty of precedent: When Victorian Britain became overwhelmed by industrialization, it spawned the Arts and Crafts design movement. Reformist ideas subsequently crossed the Atlantic to inspire Frank Lloyd Wright and foster Mission style furniture. It's a notion that bears repeating.
The excessive complexity of information technology and in particular its software is quite apparent at three levels: for software professionals, for users within a business, and for customers, too.
Late in the Victorian era, towards the end of the nineteenth century, mass production by industry had largely overwhelmed the prior culture that built things by hand or manufactured goods in small workshops. Making products on an industrial scale reduced costs, encouraged standardization, and brought many items that were formerly luxuries within the reach of ordinary Britons. Similar developments occurred across Europe and in the US and Canada, too. But the changes in goods were not limited to differences in how things were made. The industrial culture also altered what was made and what it looked like. In Britain (and elsewhere) the mass production culture seemed to forget the beneficial influence of nature and of individualism on design.
The upshot was a movement among artists, designers, architects and other people in creative professions that was called Arts and Crafts. Suddenly fabrics and wallpaper, while made in factories, began to include natural themes. So, too, did furniture, clothing, and even whole buildings. This changed aesthetic very quickly caught on with the public, which somehow sensed the excesses of the industrial mentality and felt a need for some relief, for an inclusion of non-industrial values in daily life.
The result was not an anti-industrial movement like that of the Luddites at the beginning of the same century, a century that ended with the emergence of Arts and Crafts as a notable force in a changing culture. Rather, the new aesthetic was an attempt to restore balance and to bring an element of human scale into greater prominence in a world that, for a while, seemed to have lost its interest in and respect for the individual.
If a comparable reaction occurred in computing, it would probably catch on, too. The computer industry seems to have lost touch with the people who use its products, and particularly with the fact that many people use computers because they absolutely have to. This might provide a basis for the companies that make computing systems and their software to feel they can do things the way they wish because they reach what amounts to a captive audience. But that is unlikely. What seems more probable is that the people who shape computing believe, sincerely if inaccurately, that they are producing just what the market wants and needs, when in fact most of the time they are not actually doing a very good job when it comes to product development, design, and just plain paying attention to the folk at the far end of the wire.
Software specialists, including the ones who develop systems and applications as well as those who maintain code that others have developed, are ambitious to a dangerous extent. Applications are flawed. Commercial software is plagued not only by bugs and errors but also by security problems so pervasive that even the most powerful vendors are no longer embarrassed. Microsoft, at the top of the heap, has made a routine out of monthly deployments of patches and treats this perpetual repair job as a normal part of using its code. Yet Microsoft is not alone; it is just one example.
Every software vendor has procedures in place to pass patches to customers. All the big ones also help customers learn about patches and track which ones have been applied, which ones were rejected, and why.
Basically, software has become so complicated that even the patch process can be flawed, but users cannot afford to simply ignore updates even if they feel their installed software is working fine. Many of the most important patches are not remedies for visible bugs. They are protective patches that attempt to reduce threats to users' security.
Vulnerability concerns have become so common that the US government established a bureau devoted to the broadcasting of computer security alerts. The bureau maintains a web site that includes facilities to enable anyone interested in software security to get email or RSS notifications of vulnerabilities and vendors' defensive actions.
If the work produced by the software industry, which is well staffed with highly trained software professionals, is so fragile, imagine how much more vulnerable users are. Software specialists on the user side know a lot about the productive use of their systems and applications, sometimes more than many of their counterparts on the vendor side, but in general they don't know as much about security issues. This is not an indictment of user side programmers. It's simply a reflection of the fact that the vendor side software team has complete source code, while users don't.
The issue raised here is not merely that commercial software, even the big name products that bring vendors billions in revenue, is perpetually flawed. Users that customize code they license become part of the maintenance process. Users that buy packages and run them as is must devote the time of support and maintenance personnel to the care process.
The most serious defect in this system is not any particular software bug; it is the culture that has come to tolerate this situation. There does not seem to be much of a reward (if there is any at all) for software vendors that deliver higher quality software, nor a penalty imposed on vendors whose software is seriously flawed. It might be unreasonable to expect perfection from software makers, but it is just as unreasonable for us to tolerate what amounts to shoddy work. Apparently, that is just what we do. We may try to explain the state of the industry by pointing to the main difference between computing today and computing ten or twenty years ago: the Internet. But that affects the outer layer of computing, and between our systems and the Internet face of our institutions there is a huge group of people who use computers every day, mainly or entirely because they have to: corporate end users.
There are many more people whose workdays (and, maybe, whose lives) are shaped by the culture of computing than there are software developers, maintainers, and support personnel. These people once might have worked at green screens, the most prominent of which were the IBM 3270 of mainframe shops, the 5250 used in the IBM midrange, and the DEC VT100, the screen-based face of what was once the company that posed the most significant challenge to IBM's midrange systems. Today there are still plenty of applications systems with hearts and minds of green screen software, but user terminals present a graphical interface and their mode of interaction is based on characters and clicks, not collections of data processed a screen's worth at a time.
End users almost invariably use client devices that were built with certain kinds of applications in mind and which are used, perhaps erroneously, for markedly different kinds of jobs, too.
The kind of interface that seems to improve the productivity of people composing letters, adjusting photographic images, or building a spreadsheet might actually be a detriment when it is used to do the kinds of jobs that used to be called data processing. Entering orders or payments might in fact have been pretty efficient on old green screens, particularly for end users performing repetitive tasks. Basically, compared to a GUI, a green screen is simple and the people who built data processing applications knew (and still know, if anyone bothers to ask) an awful lot about end user productivity.
We believe that many computing shops that moved from green screen to GUI technologies actually suffered setbacks in productivity. This would particularly be the case if a GUI ran on a small screen with poor resolution. Basically, a GUI on a small screen (or even a large one with relatively low resolution) can be very hard to read compared to a green screen with text and that slows down end users and almost certainly reduces their accuracy. Put the same user and the same GUI on current displays, flat screens with diagonals of 19 inches or more, and there is a good chance productivity will rise.
NEC, which has a big interest in promoting displays, has sponsored research at the University of Utah that links screen size to productivity. This year NEC had better luck getting publicity for its findings than it did after a similar effort four years ago. The conclusions were pretty much the same: for work that is based on using a GUI, and in this case that means cranking spreadsheets, doing word processing and the like rather than entering payment data into an accounting system, a single large screen or a pair of large screens boosts the productivity of a computer user compared to the same person performing the same kinds of tasks on a screen that is modest in size by today's standards. The screens that Utah says were winners included a single 24-inch display and a dual 20-inch configuration. A single screen with a diagonal of 17, 18 or even 19 inches gives relatively poor results, with the smallest screen (about the size of a very large notebook display) apparently choking off the user's performance.
One surprising aspect of the results was the very substantial difference in performance on small compared to large displays. The Utah researchers say users with 17-inch displays who also used 24-inch displays were something like 40 percent more productive on the big screens. The Utah study also turned up another surprising conclusion: You can have too much of a good thing. Users with 30-inch displays did worse than those with 24-inch displays. A big falloff in productivity seems to set in when displays hit a 26-inch size.
We found no similar studies based on users who ran, say, SAP or PeopleSoft accounting applications on screens of various sizes, and that suggests that big GUI screens used for jobs that could, if one wished, be run on green screens don't necessarily deliver stunning benefits. That has not stopped user organizations from moving their traditional green screen applications to versions that work with graphical displays. And there is probably some wisdom in this attitude. For one thing, many users who need to access data on what used to be green screen corporate systems now sit at client stations that are also used to do work that is obviously helped by a nice GUI, work that includes email, document processing, and getting to various sites on the Internet.
But all this movement to GUI access to common business applications has occurred in a cultural milieu that is burdened by at least one unfortunate kind of excess: applications developers along with companies that make software to adapt traditional applications to the world of the GUI believe that they cannot succeed unless they present end users with screens that look like the screens on Microsoft products or, sometimes, the kinds of themes found in excess on the Internet.
Yes, even for users whose computing is done within corporate offices and which is primarily carried out with the help of applications systems that could look just about any way the user company wants them to look, the computing equivalent of what late Victorians came to see as industrial excess has taken hold.
Business applications that have been designed and, more importantly, exhaustively tested in a search for ways to make end users more productive are the exception, not the rule. Unfortunately, when applications are sold the marketing process has to rely on first impressions made by prospects who examine the screens a package throws at users. Companies can't find out how well an application actually suits their end users until it's been purchased and installed, and even then it takes quite a while before users acquire the new skills it takes to use the software.
This doesn't mean that most applications packages provide worse results than they ought to, but there are in fact plenty of signs that this could well be the case.
Anyone can go out on the Internet and see sites that are easy to read and many more that are a real challenge to the eye. Sites with relatively simple layouts often provide a way for visitors to adjust the size of the type that is displayed. (This seems to have become impossible, or at least impractical, for sites with complicated pages, such as the sites run by news organizations, including this one.) Some web browsers provide magnification, and some even do this pretty well, but the end user has to have a screen with pretty good resolution for this kind of trick to work at all. This attempt to help visitors with weak eyes is not only a signal to web designers to stop coding everything for visitors with razor sharp screens and perfect eyesight; it is also a hint that some of the reaction to an overdose of industrial zeal may be taking root in computing.
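The text-size controls that simple sites offer can amount to just a few lines of script. The sketch below is a hypothetical illustration, not drawn from any particular site: a pure function that steps a base font size up or down within a readable range, which a page could wire to a pair of buttons. The 12-pixel and 28-pixel bounds and the 2-pixel step are assumptions chosen for the example.

```typescript
// Hypothetical text-size control of the kind simple sites offer visitors.
// Assumptions: sizes are in pixels; the min, max, and step values here
// are illustrative defaults, not taken from any real site.
function nextFontSize(
  current: number,
  direction: "up" | "down",
  min = 12,
  max = 28,
  step = 2
): number {
  const proposed = direction === "up" ? current + step : current - step;
  // Clamp so repeated clicks can never make the page unreadable.
  return Math.min(max, Math.max(min, proposed));
}

// A page would apply the result to its body, for example:
//   document.body.style.fontSize = `${nextFontSize(16, "up")}px`;
```

Keeping the arithmetic in a pure function like this leaves the clamping policy in one place and easy to test, whatever widget on the page ends up triggering it.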
After a decade or two of Arts and Crafts influence in Britain, and well after Art Nouveau as interpreted by Hector Guimard helped give the Paris Metro its distinctive entrances, America, which loved its industry, had an aesthetic reaction of its own. During the first decades of the twentieth century some of the most creative minds in the USA turned back to basics and away from the excesses that had begun to afflict architecture and related crafts associated with the construction of offices and homes. Two clear examples (out of quite a few possibilities) ought to be enough to explain the American way of bringing humanity back into the spaces where people lived and worked.
Frank Lloyd Wright designed a number of houses that were dubbed Prairie School buildings by architects, and the term seems to have come from Wright's effort to create homes that fit the world as he saw it looking toward the countryside from Chicago. Chicago, as much as New York, was a locus of industrial excess, and Wright tried to create homes that provided a balance, or perhaps relief, from the social trends that influenced the construction of factories, skyscrapers, and most residential structures. Basically, Wright did not scorn industrial modernity but rather tried to remind people to embrace nature, too. Just as Arts and Crafts designers in Britain created quite a few noteworthy items for use inside homes, Americans who were imbued with a kindred spirit built furniture that spoke with spare simplicity. One of the styles that arose, called Mission, was influenced by the Spanish architecture of the Old West, even when it was made in the industrial Northeast. L & J G Stickley, which has been making astounding products since 1900 in its factory at Manlius, outside Syracuse, New York, may be the last of the original Mission furniture makers still in business. The company's web site does two things that ought to be of value to the computer business, if the big guns took a moment to pay attention: It shows by example furniture that has a deep, reactive, lasting beauty. Also, it shows by example that a company might be able to design and produce some of the finest craft objects in the world and still have a heartbreaking web site.
There might be applications suites about which similar things can be said, packages that on a daily basis bring corporate end users to the brink of tears.
Still, however troubled systems software and applications software may be, however overloaded with bad taste and tendencies to excess that get in the way of real accomplishments, no part of computing provides more examples of how things have gone too far than the web.
The big talk in investment circles — or, if Wall Street continues to crumble, talk that may soon be moot — is about Web 2.0 and software as a service, but the action is mainly silly stuff. (Maybe FIOS will do the trick, but for now it's pretty inconvenient to manage a large spreadsheet or document as a web application.) The big direction in web creation is a move toward complex site development systems with names like Joomla! (star-crossed because of that exclamation point), Drupal, Mambo and WordPress. The upshot is a wave of new web sites that individually look great but also have a certain sameness that says factory rather than craft, even though the coding folk who make these new web creation systems work are very highly skilled.
It is here, on the Internet, that the computing world truly needs its Arts and Crafts, its Prairie School, and its Mission designs. It is on the Internet that somebody, or perhaps quite a few people, will look at web sites that tailor their message to fit onto an inexpensive industrial web system (which is a great example of clever software technology) rather than the other, more natural, way around and ask, if they can find a voice, for something different.
We don't know when this will happen and we sure don't know what the reaction will be, but it is a safe bet that it will soon arise. It might already be underway, but not yet apparent to people with our limited powers of perception. It might even appear just as the light of Western Civilization gutters, and lie hidden for years, awaiting rediscovery when the scholars of the next social epoch have time to pick through our ashes. That would be sad, but it would at least be consistent with that sporadically resurgent theme: Back to basics.
— Hesh Wiener July 2008