11.0105 computing: its present and future

Humanist Discussion Group (humanist@kcl.ac.uk)
Fri, 13 Jun 1997 20:22:23 +0100 (BST)

Humanist Discussion Group, Vol. 11, No. 105.
Centre for Computing in the Humanities, King's College London

[1] From: Charles Ess <DRU001D@VMA.SMSU.EDU> (115)
Subject: Re: 11.0101 future of computing

[2] From: Haradda@aol.com (4)
Subject: Re: 11.0101 future of computing

Date: Fri, 13 Jun 97 07:39:54 CDT
From: Charles Ess <DRU001D@VMA.SMSU.EDU>
Subject: Re: 11.0101 future of computing

On the occasion of Hope Greenberg's question about the future of education
and the university vis-a-vis computing, but also simply because I was
going to recommend it to HUMANIST readers in any case, I would like to
call your attention to the July issue of Scientific American, which
contains three articles of interest to computing humanists.

One very short piece is a graphic which shows the concentration of Internet
hosts/population throughout the world (p. 26). The map makes very clear
that the vision of a "global electronic village" is at a considerable
remove from a current infrastructure concentrated most heavily in North
America, Scandinavia, and Australia.

A second piece, a marvelous interview with Michael Dertouzos of M.I.T.,
attacks what he calls five myths of the Information Age. One comment
may be especially comforting to humanists who find the rate of change
a bit dizzying at times:

Myth: The information revolution is moving too quickly for most to
keep up.
Dertouzos: "We've been four decades into the business, and we've
hardly done anything. The second industrial revolution
took nine decades. So relax." (29)
Dertouzos also announces that it's time we put technology and humanism
back together!

Finally, an extensive article by W. Wayt Gibbs documents the hard look
businesses have been giving to the claims that computerization will
lead to greater productivity. (For the purposes of this article,
education is included as a business.)

Gibbs introduces his article by observing:
the [information] explosion is well under way, and its economic blessings
so far appear decidedly mixed. For all the useful things computers do,
they do not seem, on balance, to have made us much richer by enabling us
to do more work, of increasing value, in less time. Compared with the big
economic bangs delivered by water-, steam- and electricity-powered
machines, productivity growth in the information age has been a mere
whimper....Recent studies of computer use in offices reveal that much of
the time saved by automation is frittered away by software that is
unnecessarily difficult, unpredictable and inefficient. Design experts
warn that current industry trends toward increasingly complex programs and
new, untested ways of presenting information could do more harm than good
- and will almost certainly do less good than advertised. (82)

Notably, productivity growth has _dropped_ from 4.5% in the 1960s to
1.5% more recently - and these declines have occurred most significantly
precisely in those industries that have invested the most in information
technology.

In particular, many of the cutting-edge technologies dear to the hearts
of the enthusiasts among us who confidently predict the quick end of the
brick-and-mortar colleges - virtual reality, autonomous software agents,
speech recognition/understanding, the Web, and videoconferencing - are
not yet proving themselves useful, at least as far as business is
concerned.

In addition, the actual costs of introducing a simple PC into the
workplace appear to be far higher than we ordinarily admit - for a
$3,000 desktop machine, one group estimates the total costs are ca.
$23,500. These costs include technical support, the loss of time
as coworkers help one another with computer-related problems, and
"futzing" - waiting for programs to run and/or help to arrive, double-
checking printouts for accuracy and format, rearranging disk files,
playing games, and going over that presentation software just one
more time to make sure all the nifty effects are just right. (It
is rumored that Sun Microsystems banned its managers from using
presentation software to make slide presentations for meetings: 87).
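The arithmetic behind that estimate is worth making explicit. A minimal
sketch (the two dollar figures are from the article; the variable names
and the derived ratio are mine):

```python
# Hidden-cost multiplier implied by the estimate Gibbs reports:
# a $3,000 desktop carries roughly $23,500 in total workplace costs
# once support, peer help, and "futzing" time are counted.
sticker_price = 3000   # desktop machine
total_cost = 23500     # one group's total-cost estimate
hidden = total_cost - sticker_price
multiplier = total_cost / sticker_price
print(f"Hidden costs: ${hidden:,} ({multiplier:.1f}x the sticker price)")
# -> Hidden costs: $20,500 (7.8x the sticker price)
```

On these figures, nearly seven-eighths of the true cost of a workplace
PC is invisible at purchase time.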

Part of the discrepancy between promise and reality may be explained
by what academics would call assessment: it is notoriously difficult
to assess what "productivity" in education might mean, much less
determine how much of any demonstrated gain might be traced to computers.

On the bright side (for us computer enthusiasts): productivity gains can be
demonstrated for computer uses that involve human-factors engineering to
custom-fit interfaces and applications to specific tasks. But this lesson
is being learned only slowly. Surveying 83 recent computer-human
interface projects, Gibbs reports:
...only nine of those 83 projects compared workers' performance on real
tasks using the new interface with their current way of doing things.
Four offered no gains at all. Radiologists completed their reports faster
without the computer. Video offered no improvement over audio for
collaborative writing or design. Only three new interfaces - an
interactive blueprint program, the combination of a keyboard joystick with
a mouse for two-handed input, and a "wearable" computer - sped work
significantly. (89)

While this report focuses on productivity gains in business, some
analogies may be drawn for computer use in education. At least, this
report reinforces my concern that students can waste far more time on
computers - between game-playing and spiffing up their web pages - than
they may spend taking advantage of the many tremendous learning
opportunities made possible by computers and networks. It further
reinforces my sense that we run into serious questions of balance the more
we focus on the _means_ of presentation (Web pages, PowerPoint, etc.)
vis-a-vis the _content_ being presented.

Finally, all of this reinforces my sense that academics - e.g., the
President of a major state university down the street - who think that the
future of education lies in "every faculty member putting his/her courses
on the Internet" are painting with very broad and highly misleading
strokes. My own experience - ranging from disappointing experiments with
videoteleconferencing to somewhat successful (because, this article
suggests, they were highly focused and specific) uses of e-mail and the
Web - tells me that these technologies can indeed powerfully supplement
more traditional classroom approaches (including readings, discussions,
indeed - lectures!). But only supplement, not replace.

If this is true, those institutions that have jumped on the distance
learning boat in the belief that the Internet and the Web will replace
bricks-and-mortar institutions may have jumped just a wee bit too soon.

I strongly recommend Gibbs' article to anyone in the computing/humanities
world who is put in a position of having to justify the costs of acquiring
new technologies (all of us?). While some of us will not agree with
every point, and serious questions need to be raised about the analogy
between business and education - at least we need to know what our very
business-minded administrators may be thinking when they review our next
budget requests.

Appropriate to this audience, I've made some extensive notes on the Gibbs
article at
This will give a strong sense of the article, but omits a great deal of
Gibbs' supporting evidence and examples.

Cheers -
Charles Ess
Drury College
Springfield, MO 65802 USA

Date: Fri, 13 Jun 1997 10:09:54 -0400 (EDT)
From: Haradda@aol.com
Subject: Re: 11.0101 future of computing

The future is not as far away as you might think. I have a CD-ROM product
called the LDS Collectors Library 97 (a Folio infobase product) in which
I can type a topic, thought, or phrase, and it will bring up all the hits
in its library. I would say that it is about 75% of the way to becoming
all-inclusive in its area.