3.600 supercomputing and the way computers work, cont. (93)

Willard McCarty (MCCARTY@vm.epas.utoronto.ca)
Tue, 17 Oct 89 19:46:55 EDT

Humanist Discussion Group, Vol. 3, No. 600. Tuesday, 17 Oct 1989.

(1) Date: Mon, 16 Oct 89 22:19:16 BST (20 lines)
From: <Ron.Brasington@READING.AC.UK>
Subject: I'm NOT and I'm not a number.

(2) Date: Mon, 16 Oct 89 15:31 EST (54 lines)
Subject: more supercomputing

(1) --------------------------------------------------------------------
Date: Mon, 16 Oct 89 22:19:16 BST
From: <Ron.Brasington@READING.AC.UK>
Subject: I'm NOT and I'm not a number.

Who's misleading whom? The processor in my machine works by
passing signals through LOGIC gates and the output of an AND
operation as sure as heck ain't a number - even if TRUTH
tables (sic) are sometimes (for convenience) filled out with
1's and 0's.
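Ron's point can be illustrated with a small sketch (in Python, added
purely for illustration and not part of his posting): the AND operation
is defined over truth values, and writing its truth table with 1's and
0's is just a labelling convention laid on top of the logic.

```python
# The AND operation is a function over truth values, not numbers.
# Relabelling True/False as 1/0 changes the notation, not the logic.

def AND(p, q):
    return p and q  # defined over booleans, no arithmetic involved

# The full truth table for AND, printed in the conventional 1/0 notation.
for p in (False, True):
    for q in (False, True):
        print(int(p), int(q), "->", int(AND(p, q)))
```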

Ron Brasington.

Department of Linguistic Science,
Faculty of Letters and Social Sciences,
University of Reading,
PO Box 218,
Reading, RG6 2AA, UK.

(2) --------------------------------------------------------------------
Date: Mon, 16 Oct 89 15:31 EST
Subject: more supercomputing

The issue of how humanists can use supercomputers has covered -- to one degree
or another -- the politics, the circuitry, and the applications, with more
heat than light in some regards. Yes, Michael S-McQ is mostly right (as is
Bob Amsler). Computers are physical mechanisms that retain information in a
two-state manner, which can then be interpreted by the hardware, the software,
and the users to mean whatever one wishes. One of the primary differences
between "supercomputers" (and I don't mean very large general purpose
computers) and ordinary machines is that they have enhanced mathematical
capabilities. As someone pointed out, floating-point multiplication of complex
numbers takes a lot more calculating power than addition or subtraction. The
issue is not how the humanities can use mathematical capabilities, but rather
what capabilities (mathematical or otherwise) do humanities applications
really need? And which humanities applications need the kind of heavy-duty
horsepower (how's that for a metaphor?) that supercomputers provide?
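The cost difference mentioned above can be made concrete with a sketch
(Python, added for illustration; the operation counts are the standard
textbook ones, not anything claimed in the posting): multiplying two
complex numbers takes four real multiplications plus two real
additions/subtractions, while adding them takes only two real additions.

```python
# Complex multiplication vs. addition, spelled out in real-number
# operations: (a + bi)(c + di) = (ac - bd) + (ad + bc)i.

def complex_mul(a, b, c, d):
    # 4 real multiplications + 2 real additions/subtractions
    return (a * c - b * d, a * d + b * c)

def complex_add(a, b, c, d):
    # 2 real additions
    return (a + c, b + d)

print(complex_mul(1.0, 2.0, 3.0, 4.0))  # (1+2i)(3+4i) = -5+10i
print(complex_add(1.0, 2.0, 3.0, 4.0))  # (1+2i)+(3+4i) = 4+6i
```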

	The operations primarily performed in humanities computing are
text manipulations -- comparing, sorting, rearranging, etc. -- on pieces
of text data, from single characters to entire volumes. The low-level
operations involved include comparison and the related test to determine the
result of the compare; simple arithmetic functions (addition, subtraction,
multiplication, and possibly division of integers); and character/string
movement. Compare and test of a single byte is a two- or three-cycle
instruction; applying it to large amounts of data compounds the execution
time algebraically. The arithmetic functions are "housekeeping functions"
used to keep up with where one is at each point. The data movement
operations are fairly straightforward, but also multiply algebraically with
increasingly larger amounts of data.
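The scaling point above can be sketched as follows (Python, added for
illustration; the function name is my own): each individual byte
compare-and-test is trivially cheap, but a naive substring search
multiplies that cost roughly by the product of text length and pattern
length.

```python
# Counting byte compare-and-test operations in a naive substring search:
# the per-byte cost is tiny, but the total grows with the amount of text.

def naive_find(text: bytes, pattern: bytes):
    comparisons = 0
    for i in range(len(text) - len(pattern) + 1):
        for j, byte in enumerate(pattern):
            comparisons += 1          # one byte compare-and-test
            if text[i + j] != byte:
                break
        else:
            return i, comparisons     # match found at offset i
    return -1, comparisons

pos, cost = naive_find(b"to be or not to be", b"not")
print(pos, cost)  # match at offset 9, after 12 byte comparisons
```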

	The problem with humanities computing, therefore, is not that we need
sophisticated arithmetic operations, but that we have to deal with large
quantities of text data in some reasonable amount of time. The supercomputers
in use today are certainly faster than my Compaq 386, but raw speed is not
where their increased value in scientific work lies.

	The kind of application in which I could envision a very high-speed
humanities computer being of value is real-time language processing,
such as language understanding with immediate response. Among the reasons
that general purpose language understanding is not available are the vast
amount of processing that must be done to parse, and the need to access vast
lexicons (not currently available) and multi-level knowledge bases covering
general world knowledge as well as special domain knowledge. Trying to do all
that within the three seconds that a user will wait without getting antsy
about the delay warrants an increased computing capacity.

Mary Dee Harris
Washington, DC