4.0404 Nature of Computers (4/163)

Elaine Brennan & Allen Renear (EDITORS@BROWNVM.BITNET)
Wed, 22 Aug 90 15:47:39 EDT

Humanist Discussion Group, Vol. 4, No. 0404. Wednesday, 22 Aug 1990.

(1) Date: 22 Aug 90 10:12:00 EDT (62 lines)
From: "Mary Dee Harris" <mdharris@guvax.georgetown.edu>
Subject: Computers -- 0's and 1's

(2) Date: Tue, 21 Aug 90 22:45:42 EDT (13 lines)
From: Frank Dane <FDANE@UGA>
Subject: Re: 4.0400 Computers -- Names for and Nature of

(3) Date: Wed, 22 Aug 90 00:00:26 +0100 (45 lines)
From: iwml@ukc.ac.uk
Subject: Computers

(4) Date: Tue, 21 Aug 90 23:44:16 GMT+0100 (43 lines)
From: macrakis@ri.osf.fr
Subject: 4.0395 Why "computers"; Jim Sledd (2/74)

(1) --------------------------------------------------------------------
Date: 22 Aug 90 10:12:00 EDT
From: "]" <mdharris@guvax.georgetown.edu>
Subject: Computers -- 0's and 1's

Speaking as a long-time Computer Science prof, I want to clear up some
misunderstandings. Yes, we teach that the electrical circuits operate
in either an off or on state, but that's really an oversimplification.
Electricity does indeed work a bit like water flowing through pipes (not
exactly but close enough for this discussion) in that the level of flow
is not nothing or complete. The 0/1 distinction comes from the level
passing a certain threshold -- below that level is 'off' and above that
level is 'on'. Exactly at the chosen level things are a bit fuzzy,
because the circuit is switching from one state to the other, but that
transition region is quite small.

The 0/1 distinction was chosen as a convenience. Very early computer
designers contemplated using a decimal system rather than binary, but
found that fluctuations in the circuits (did you just see the lights
dim slightly?) caused the 7's to become 5's, slide up to 8's, and then
settle back to 7's. It was essentially an even worse case of the
situation described in the previous paragraph.

With regard to the early history of computing, yes, much of the early
programming was mathematical, but not all. Remember Alan Turing (who
was a mathematician and probably left-handed) can be considered the
father of artificial intelligence. The famous Turing test had nothing
to do with mathematical calculations, but with communicating with a
human in such a way that the human could not detect the computer. I've often
thought that the metaphor of numbers in the computer was misleading.
Calculations are actually done by switching circuits on and off, not by
some finger-counting 1, 2, 3, ... method. (I used to teach assembly
language programming which required teaching the underlying circuitry of
the registers, etc.)
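The "switching, not finger-counting" point can be made concrete with a
textbook ripple-carry adder -- addition built entirely from Boolean
on/off operations (a standard construction, sketched here in Python):

```python
def full_adder(a, b, carry_in):
    """One-bit addition built purely from on/off (Boolean) switching."""
    s = a ^ b ^ carry_in                         # sum bit (XOR gates)
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit (AND/OR gates)
    return s, carry_out

def add(x, y, width=8):
    """Ripple-carry addition of two integers, one bit at a time."""
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add(7, 5))  # 12
```

No counting happens anywhere; the answer emerges from gates opening and
closing, which is all the registers underneath are doing.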

With regard to the analog/digital discussion of watches, it is very much
the same issue as well. Time is analog as is sound. Thus a digital
representation of time is breaking the continuum up into tiny chunks.
For philosophical (and esthetic) reasons, I've always worn an analog
watch, in fact one that has no marks (except for a gold ball at the 12
o'clock point). At the pool recently I wore an old watch with small
marks at the quarter hour points and found it difficult to tell time
with. (We certainly are creatures of habit, aren't we?) I decided some
time ago that at this time of my life, I don't need to know the time
more precisely than what I can see by the position of the hands. The
plane may be scheduled for 12:03, but I know I have to be there ahead of
time anyway, and it probably won't leave on time.
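Breaking a continuum into tiny chunks is the same sampling-and-
quantization step behind digital audio; a small Python sketch (the
sample rate and number of levels here are arbitrary, chosen only for
illustration):

```python
import math

# Sample a continuous signal (a sine wave standing in for sound, or for
# the sweep of a watch hand) at discrete instants, then round each
# sample to one of a fixed set of levels -- the "tiny chunks".
SAMPLE_RATE = 8      # samples per cycle (illustrative)
LEVELS = 16          # quantization levels (illustrative)

def quantize(x):
    """Round a value in [-1, 1] to the nearest of LEVELS discrete steps."""
    step = 2 / (LEVELS - 1)
    return round(x / step) * step

samples = [quantize(math.sin(2 * math.pi * t / SAMPLE_RATE))
           for t in range(SAMPLE_RATE)]
print(samples)
```

Everything between two samples, and between two levels, is simply
discarded -- the digital watch's version of ignoring where the hand is
between the marks.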

Digital recording of sound is better than analog recording because of the
medium and the equipment, not because of the nature of sound.
Phonograph records and needles must be of very high quality to produce
excellent sound, and they are subject to damage. Cassette tapes get
crinkled and stretched. But digital recording such as on a CD is
stable. Producing digital media is less error prone because of that,
and errors in the recording can be corrected during production.
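That error correction can be illustrated with a toy majority-vote
repetition code -- far cruder than the Reed-Solomon coding CDs actually
use, but the principle of recovering from a damaged read is the same:

```python
def encode(bits):
    """Store each bit three times so one flipped copy can be outvoted."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Recover each bit by majority vote over its three copies."""
    return [1 if sum(coded[i:i+3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

data = [1, 0, 1, 1]
coded = encode(data)
coded[4] ^= 1            # simulate a read error on the medium
print(decode(coded))     # [1, 0, 1, 1] -- the error is corrected
```

A scratch on a phonograph record is simply lost; redundancy in the
digital stream lets the player reconstruct what was there.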

So much for the discussion of analog/digital. I really should finish my
paper for the September conference. Office hours will be held later.

Mary Dee Harris
mdharris@guvax.bitnet
mdharris@guvax.georgetown.edu

(2) --------------------------------------------------------------20----
Date: Tue, 21 Aug 90 22:45:42 EDT
From: Frank Dane <FDANE@UGA>
Subject: Re: 4.0400 Computers -- Names for and Nature of (4/109)

One advantage to the name "computer" is that it is descriptive.
Whether dealing with numeric or alpha representation, my (simple)
understanding of the CPU is that it adds and subtracts, period.
Thus, even when processing text all that is going on is a great
deal of adding and subtracting, computing numbers. It's all
the software that makes those binary numbers text, graphics,
or (my favorite) numbers. What's wrong with description?
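Frank Dane's point -- that text is just numbers plus software
interpretation -- is easy to demonstrate; here ASCII character codes are
manipulated with plain arithmetic (Python, for illustration):

```python
# Text is numbers underneath: each character is stored as an integer,
# and "textual" operations are arithmetic on those integers.
word = "HAL"
codes = [ord(c) for c in word]
print(codes)              # [72, 65, 76]

# Adding 1 to each code yields the next letter -- pure computation.
shifted = "".join(chr(n + 1) for n in codes)
print(shifted)            # IBM
```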

Frank Dane, Mercer University
(3) --------------------------------------------------------------57----
Date: Wed, 22 Aug 90 00:00:26 +0100
From: Ian Mitchell Lambert <iwml@ukc.ac.uk>
Subject: Computers

I note there is a discussion on the word 'computer' currently filling the
air. In a short paper recently I was pondering whether a new name would
be needed for future developments. As I understand it, we have
binary-based computers only because that is how they were invented in
the early days. I gather some research departments have developed other
'computers' with other bases.

'Computer' is also approaching interchangeability with 'IBM' and 'Mac',
in much the same way as 'hoover'.

To what extent is our current binary computer system a form of Western
colonialism? It matches Western binary thought. It insists that other
linguistic systems adapt themselves to it. It appears to have a
commercial monopoly in both hardware and software worldwide.

Ian Mitchell Lambert       PhD research student
Tangnefedd                 Department of Theology
Windmill Road              University of Kent at Canterbury
Weald                      United Kingdom
Sevenoaks
Kent                       Co-ordinator
TN14 6PJ                   AIBI Network
                           (Association Internationale
                           Bible et Informatique, Maredsous,
                           Belgium)

Telephone (UK): 0732 463460
(international): +44 732 463 460
Email JANET: iwml@uk.ac.ukc
EARN/BITNET: iwml@ukc.ac.uk
or iwml%uk.ac.ukc@ukacrl
Fax 0732 741475 (overseas +44 732 741475)

Microlink mailbox (available via Dialcom and Goldnet (Israel) and JANET)
MAG33187

(4) --------------------------------------------------------------61----
Date: Tue, 21 Aug 90 23:44:16 GMT+0100
From: Stavros Macrakis <macrakis@ri.osf.fr>
Subject: 4.0395 Why "computers"; Jim Sledd (2/74)

Subject: Do computers compute?

Jim O'Donnell, you bring up a number of interesting issues, and manage
to tie them into such a knot that I have some difficulty in responding
coherently. I think I understand your complaint: on-campus computer
services don't respond to humanists' needs. But I'd hardly attribute
this to the baneful influence of the root "compute", and certainly not
the digital nature of computers.

Software is simply very primitive. You give the example of the
limited character set available in E-mail. Fixing this -- it is
called Internationalization in the biz -- is going forward, but
slowly, despite the enormous international business market. (The
humanist is not alone in this, for that matter: the physicist would
love to include formulae and graphs in his E-mail...) ((You can get
most of this functionality today on NeXTs, but only between NeXTs ....))
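The limited character set was a real constraint: mail paths of the day
assumed 7-bit US-ASCII, so anything beyond 128 code points could be
mangled in transit. A sketch of the gap, using UTF-8 (an encoding that
postdates this exchange, used here purely for illustration):

```python
text = "café"

# 7-bit ASCII covers only code points 0..127, so this text could not
# survive an ASCII-only mail path intact.
ascii_ok = all(ord(c) < 128 for c in text)
print(ascii_ok)           # False -- 'é' has no 7-bit ASCII code

# UTF-8 spends extra bytes on characters beyond ASCII instead of
# dropping them.
encoded = text.encode("utf-8")
print(len(text), len(encoded))   # 4 characters, 5 bytes
```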

Have some pity for the hard scientists. Since the machines were
already there for their calculations, they got E-mail and word
processing when it would have been prohibitively expensive to buy a
computer for that alone. Our expensive playthings have become your
daily utility, but are distorted by our tastes. Patience.

Digital computers are not philosophy. They're a specific technology,
based on discrete mathematical structures implemented with digital
electronics. Most AI people are not struggling against this; they are
trying to build on top of it. Some researchers like other models
("neural nets") for certain things. Neural nets will not fix your
E-mail problems.

Stavros Macrakis
Grenoble

PS: informat-ics, -ique, -ica, -ik in English, French, Italian,
German; pliroforiki in Mod. Greek (same meaning)