Humanist Discussion Group

Humanist Archives: Feb. 25, 2021, 9:25 a.m. Humanist 34.237 - A hermeneutics of computer engineering?

              Humanist Discussion Group, Vol. 34, No. 237.
        Department of Digital Humanities, University of Cologne
                      Hosted by DH-Cologne

        Date: 2021-02-24 16:32:04+00:00
        From: Willard McCarty <>
        Subject: a hermeneutics of computer engineering?

In "Analysis of scientific and philosophical texts"*, Roberto Busa
writes of hermeneutical interpretation as a set of scientific strategies
for explicating "the net of thoughts and stages that generated the outer
expression", constructing "scientific strategies for deprogramming the
inner living parts of the vital thinking which originated and authored
the text."

These words have a hold on me because, I suspect, of what that word
'deprogramming' (along with 'net', 'stages' and 'parts') does to an
otherwise unremarkable statement. For one thing, it outlines what I'd
like to think a hermeneutics of digital computing would do on both the
levels of software and hardware. There are other, perhaps many other,
ways of going about this hermeneutics than the one I am attempting at
present: a hermeneutics of the engineering involved in the design of
computing systems, from microprocessors to operating systems.
Not having time to go back to school, I'm looking for sources of
explanation simple enough for me to understand yet complex enough not to
be excessively lossy. Simple but not too simple.

Not an easy assignment. On the design of the hardware the best I have
found are these:

(1) Two interviews with chip designer Jim Keller, conducted by Lex
Fridman and available on YouTube, which I strongly recommend as worth
close attention. He is astonishing. If only I could get him to write a
book of the sort that these interviews suggest!

(2) John L. Hennessy and David A. Patterson, Computer Architecture: A
Quantitative Approach, 4th edn. (2007)

On the design of operating systems, the sources that seem most helpful
are those which deal with abstraction layering and information hiding,
especially those that admit Joel Spolsky's "Law of Leaky Abstractions". But
what I do not find are workings out of the consequences of such leaks at
the level at which we normally work. We all know Edward Sapir's
linguistic proverb "All grammars leak", but in the case of natural 
language we have no trouble finding examples of how normative 
grammars fail and so urge our understanding on. How about the 
grammars of circuitry and code?
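By way of a small illustration (my own example, not drawn from the sources
above): floating-point arithmetic is one place where the leak is easy to
show. A programming language such as Python presents its numbers as
ordinary decimal quantities, yet the binary representation used by the
hardware underneath shows through in the simplest sums.

```python
# The "real number" abstraction leaking: Python floats behave almost,
# but not quite, like decimal arithmetic, because the processor stores
# them as IEEE 754 binary fractions.
total = 0.1 + 0.2
print(total)            # prints 0.30000000000000004, not 0.3
print(total == 0.3)     # prints False: the hardware leaks upward

# The leak originates below the language: 0.1 has no finite binary
# expansion, so the machine stores an approximation. Python lets us see
# the stored bits directly.
print((0.1).hex())      # prints 0x1.999999999999ap-4 -- a repeating
                        # binary pattern, truncated by the hardware
```

Here the "grammar" of the code (numbers add as in school arithmetic)
fails in just the way a normative grammar of a natural language fails,
and the failure points back to the circuitry beneath.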

I'm not at all sure anyone can make sense of this, but if there are any
suggestions of where to go with it, they'd be most welcome!


*In Dino Buzzetti, Giuliano Pancaldi and Harold Short, eds. Augmenting
Comprehension: Digital Tools and the History of Ideas.  Proceedings of a
conference at Bologna, 22-23 September 2002. London: Office for
Humanities Communication, 2004.

Willard McCarty,
Professor emeritus, King's College London;
Editor, Interdisciplinary Science Reviews;  Humanist
