
Humanist Discussion Group

< Back to Volume 32

Humanist Archives: Feb. 8, 2019, 6:42 a.m. Humanist 32.435 - the McGann-Renear debate

                  Humanist Discussion Group, Vol. 32, No. 435.
            Department of Digital Humanities, King's College London
                   Hosted by King's Digital Lab
                Submit to: humanist@dhhumanist.org

    [1]    From: Desmond Schmidt 
           Subject: Re: [Humanist] 32.428: the McGann-Renear debate (20)

    [2]    From: Martin Mueller 
           Subject: Re: [Humanist] 32.428: the McGann-Renear debate (35)

    [3]    From: Hugh Cayless 
           Subject: Re: [Humanist] 32.428: the McGann-Renear debate (98)

    [4]    From: Jan Christoph Meister 
           Subject: Re: [Humanist] 32.423: the McGann-Renear debate (54)

        Date: 2019-02-07 18:54:12+00:00
        From: Desmond Schmidt 
        Subject: Re: [Humanist] 32.428: the McGann-Renear debate

I agree with this bit from Herbert:

> But in the McGann/Renear-debate the question
> was, IMHO, another: Can I express my argument in a formal OHCO designed
> language? And I would answer NO.

It's good to see someone else who thinks that hierarchies do not cut
it. In the best of all possible worlds, if we could have a technology
that did anything we wanted, would we need hierarchies? IMHO, no.

Desmond Schmidt
Queensland University of Technology

Dr Desmond Schmidt
Mobile: 0481915868 Work: +61-7-31384036

        Date: 2019-02-07 15:57:11+00:00
        From: Martin Mueller 
        Subject: Re: [Humanist] 32.428: the McGann-Renear debate

Various quotations come to mind in following this thread's relitigation of an old
debate: "Hard cases make bad law" or "The squirming facts exceed the squamous
mind" (from Wallace Stevens' "Connoisseurs of Chaos"). But there is also "as if":
if you treat a text as if it were an ordered hierarchy of content objects it
will often enough work well enough to be useful, especially if user communities
can agree on pretending in the same ways so that you can make use of my
pretences and I of yours without another layer of translation.

The TEI rules were first expressed in SGML, a technology developed by IBM.  They
now live in XML, another technology that mainly serves the interests of
business. In "A Midsummer Night's Dream" Theseus curtly dismisses the offer of
an entertainment with the title "The thrice three Muses mourning for the death
/ Of Learning, late deceased in beggary". Humanists have always been parasites
at the tables of the rich and powerful, making do with the scraps from their
table. Writing re-entered the Greek world around 800 BCE as the adaptation of a
Semitic alphabet--almost certainly in the context of trade. In some ways the
humanists have had the last laugh: the rare surviving fragments of early Greek
writing include a hexametric line celebrating the aphrodisiac powers of Nestor's
cup. And the Iliad would not be the poem we still read had it not been for the
business technology of writing.

Some of the complaints about the straitjacket of TEI are not unlike the
complaints Plato raised about writing. We're still doing it and are probably
better off for it. I share Hugh's pragmatic attitude towards the TEI. He is
right in saying that you cannot always draw a clear line between meaning and
rendering. There is a strip of increasingly wet sand on any beach, but it hardly
erases the distinction between wet and dry.

Martin Mueller
Professor emeritus of English and Classics
Northwestern University

        Date: 2019-02-07 15:34:02+00:00
        From: Hugh Cayless 
        Subject: Re: [Humanist] 32.428: the McGann-Renear debate


> It can be understood in the following way: After the "Ausforderung" (to
> come out of a place behind the other plants; to be spoken or sung by
> actresses figuring as female gardeners) another group of singing actresses
> takes over the speech, coming out of the hidden place while beginning to
> sing. To speak with Hugh Cayless: Such a representation of text can be
> seen as 'the dramatic text itself' (what is spoken on the stage by
> allegoric and other characters) along with an argument for how to understand
> it. In other words: the typesetting bears meaning. Surely, we can discuss
> whether my understanding meets that of the author; but how to decide? In the
> McGann/Renear debate, however, the question was, IMHO, another: Can I express my
> argument in a formal OHCO-designed language? And I would answer NO.

Here we necessarily bump into my limited understanding of the textual
tradition of Faust, and moreover of the text-critical concerns and
practices of those who study the text. As an aside, the latter is, if you
will, yet another dimension in the already multidimensional problem-set
Peter Robinson has described. But nonetheless, my answer depends a bit on
what you're asking. If it's whether both the speech and poetic (verse line)
hierarchies can be simultaneously preeminent, the answer is clearly "no".
If it's (more practically) whether you can do this in TEI, then "yes". The
Faust Edition folks have done it in fact, by deciding to treat the speech
hierarchy as dominant and to split the verse line into two. And this is
fine. I could take that TEI source and transform it into one where the
verse line hierarchy was dominant and the speeches were split.
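The split Hugh describes can be sketched in TEI with the @part attribute. This is a minimal illustration of the technique, not the Faust Edition's actual markup; the speaker names and line text are placeholders:

```xml
<!-- Speech hierarchy dominant: one metrical line is split across two
     speeches, and @part records that the two <l> fragments ("I" for
     initial, "F" for final) together form a single verse line. -->
<sp>
  <speaker>Gärtnerinnen</speaker>
  <l part="I">First half of the shared verse line,</l>
</sp>
<sp>
  <speaker>Gärtner</speaker>
  <l part="F">and its completion by the next group.</l>
</sp>
```

The inverse encoding, with the verse line dominant, would typically keep each `<l>` intact and mark the speech boundary with an empty milestone-style element inside it.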

> So much for today, or rather tonight. I'm old and eye-handicapped, and I must
> take care of my sanity. If you respond, please answer some questions:
> What kind of 'content object' is the mdash after "Locken!"? What is its
> meaning? Where is it to be placed in the tree? And take a look at the mark-up
> done by the "Faustedition".

Short answer: I don't know, and I'm not really in a position to know.
Uninformed speculation: It might indicate a pause, it might signal a shift
that prefigures the change in speaker, it might mean the singers are
beginning a parenthesis (which is then interrupted). It reminds me a little
of the ancient paragraphos mark, which could be used to indicate that a
speaker change occurs in the marked line. Is it part of the verse line or not?

If you're asking whether I could represent these differing possibilities in
TEI, then the answer is "yes". Whether it would be worth doing is another
question. I note that the Faust Edition folks just have an em-dash inside
the tags.
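The differing possibilities mentioned above could each be made explicit in TEI. A hedged sketch — the element choices here are mine, not the Faust Edition's:

```xml
<!-- Reading 1: the dash is plain text, simply part of the verse line. -->
<l>Locken! —</l>

<!-- Reading 2: the dash is marked up as a pause within the line. -->
<l>Locken! <seg type="pause">—</seg></l>

<!-- Reading 3: the dash is treated as editorial signal of the coming
     speaker change, recorded outside the line content. -->
<l>Locken!</l>
<note type="editorial">The dash anticipates the change of speaker.</note>
```

Each encoding commits to a different interpretation, which is exactly the point: the markup is an argument, not a neutral record.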

Responding to your second message:

> I really appreciate your well-reflected, pragmatically minded defense of
> real-world editions in a world of imperfection. But the question here,
> IMHO, is not: How good or problematic is the world of text processing today?
> Instead, inspired by Barbara Bordalejo's position paper, I would prefer to
> ask: In what respect is this world really better than the one 20, 30 or 50
> years earlier? And, far more important perhaps: What desirable developments
> were blocked by the adoption of industrial standards when the publishing
> industry voted for the OHCO way of text processing?

This is a good question. I think the choice that was made was pragmatic and
allowed progress to be made faster in some areas than it might have been
otherwise. Obviously this is speculation, but my guess is that in an
XML-free world there might have been a split between the digital
typesetting (think LaTeX) and text-as-database approaches which didn't
happen because XML could work with both. That might have made for even more
of a split between those who want to approach textual criticism from a
digital angle and those who just want to make editions. In my own small
corner of the universe, the exciting developments in this area are mostly
recent, have come from a sustained engagement with the TEI, and would have
been immeasurably harder to achieve without it.

Would we be better off in an XML-free world? I doubt it, actually. As Peter
has pointed out, this stuff is *hard*. TEI/XML may at least collapse a
couple of the dimensions of the problem, so we don't have to worry about
them in addition to everything else. Because it is a hard problem, I'm not
sure we should expect very fast progress to have been made on it in any
world. Technology doesn't make everything easier.

> I'm looking for analogies. The world of programming languages was a bit
> better after moving from BASIC to Pascal in studying and practicing
> informatics. The world of database management systems moved into better
> times by applying relational algebra and non-redundant storage of data.
> Analogously, publishing new written work was remarkably more effective and
> reliable after the introduction of the SGML standard and its derivatives, no
> question. But does this also hold for works of literary art? For the
> representation of pre-existing texts standing in a long historical tradition?

I certainly think so. But again, progress takes time. In (North American)
Classics, we're really only now at the point where we're ready to seriously
start working on producing digital critical editions of major texts (the
sub-disciplines of epigraphy and papyrology were a decade or so ahead).
This is nearly four decades after we first had a digital version of most of
the corpus of ancient Latin and Greek! And these new editions, I should
note, are less complex than the sort of representation Peter has been
working on and has made very interesting progress with. I'm optimistic, but
there's still a *lot* of work to be done.

All the best,

        Date: 2019-02-07 08:34:25+00:00
        From: Jan Christoph Meister 
        Subject: Re: [Humanist] 32.423: the McGann-Renear debate

Dear William and Henry,

"somebody" has indeed come up with that approach about 10 years ago...
Our annotation platform CATMA (http://catma.de) uses external standoff
markup in a XML (and also TEI-compatible: we use the feature structure
tag function as a work around) format which allows us to handle all
sorts of overlap, including

   * categorial overlap
   * discontinuous annotation
   * single user conflicting annotations
   * multi-user collaborative annotations
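As a rough illustration of the external standoff idea — not CATMA's actual file format; the identifiers, types, and feature names below are invented — a TEI feature structure can point back into an untouched source:

```xml
<!-- The source text is never modified; annotations live in a separate
     document and point into it by reference. -->
<seg xml:id="s1">Locken!</seg>

<!-- Standoff annotation expressed as a TEI feature structure: -->
<fs type="annotation">
  <f name="target"><string>#s1</string></f>
  <f name="tag"><string>speaker-shift</string></f>
  <f name="annotator"><string>user-42</string></f>
</fs>
```

Because annotations only reference the source by pointer, overlapping, discontinuous, and conflicting annotations are simply multiple `<fs>` entries whose targets may intersect — nothing in the source's own hierarchy has to accommodate them.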

CATMA is a web service, so we handle all the data management. The backend
is currently being moved to a GitLab/graph database architecture.
Beyond the usual benefits (interoperability, maintenance, etc.), the functional
advantages of going this route include

   * the annotated source file is kept perfectly stable (no corrupt
     standoffs and pointers)
   * versioning support
   * multi-user collaborative annotation (either taxonomy based
     'top-down' or explorative free-style 'bottom-up' tagging)
   * seamless integration of analytical functionality across source
     document and/or annotations.

Colours, basic visualisations, etc. are included. And we recently launched
a visualisation prototype which can take the CATMA annotation output
files as input and allows them to be explored in depth. On this see

Feel free to contact me off-list in case you want more detailed
technical information (also see https://github.com/mpetris/catma).



Dr. Jan Christoph Meister
Universitätsprofessor für Digital Humanities
Schwerpunkt Deutsche Literatur und Textanalyse
Institut für Germanistik
Universität Hamburg
Überseering 35
22 297 Hamburg
+49  40 42838 2972
+49 172 40865 41

Unsubscribe at: http://dhhumanist.org/Restricted
List posts to: humanist@dhhumanist.org
List info and archives at: http://dhhumanist.org
Listmember interface at: http://dhhumanist.org/Restricted/
Subscribe at: http://dhhumanist.org/membership_form.php

Editor: Willard McCarty (King's College London, U.K.; Western Sydney University, Australia)
Software designer: Malgosia Askanas (Mind-Crafts)

This site is maintained under a service level agreement by King's Digital Lab.