Humanist Discussion Group

Humanist Archives: Feb. 8, 2023, 6:32 a.m. Humanist 36.385 - ChatGPT and trust

              Humanist Discussion Group, Vol. 36, No. 385.
        Department of Digital Humanities, University of Cologne
                      Hosted by DH-Cologne
                Submit to:

    [1]    From: James Rovira <>
           Subject: Re: [Humanist] 36.383: ChatGPT as author (48)

    [2]    From: Christian-Emil Smith Ore <>
           Subject: Re: [Humanist] 36.383: ChatGPT as author (18)

    [3]    From: Willard McCarty <>
           Subject: trusting machines and people (37)

        Date: 2023-02-07 17:20:34+00:00
        From: James Rovira <>
        Subject: Re: [Humanist] 36.383: ChatGPT as author

Thank you for the response, Tim. That's an interesting extension of
disclosure. For the things I write, I wouldn't feel the need to disclose
that I was using Word, for example. It's assumed that I do, or that I use
some kind of equivalent. Actual publication reformats the document using a
number of other tools, so in this case the final product, aside from the
text, would be out of the author's hands. If I publish something on the web
(except on my own blog), someone else codes it, or more likely uses an
editor of some kind, or Wordpress. If I publish something in print, the
publisher uses a number of tools through the proof-copy stage to generate a
final .pdf, but those are out of my hands too. The publisher would have to
disclose those tools. But if I were writing code? I think I'd think
differently about it, yes.

Are copy-editors coauthors? Conventionally, no.

Jim R

On Tue, Feb 7, 2023 at 1:59 AM Humanist <> wrote:

> A conversation about this in a class of PhDers I'm teaching
> led to the idea that a fairer transparency might be to list
> all tools used in the production of written texts, including
> the word processor (MS Word, for example), or text processing
> (LaTeX, for example), the spell checking dictionaries used,
> the grammar checker used, and any diagramming or image
> processing tools used.
> This kind of tool use transparency has been a common practice
> in computer programming for a long time -- at least in corners
> I've inhabited -- where it is/was important to know the
> details of the hardware type, operating system, compiler,
> software libraries, programming environment, etc (including
> all version numbers) someone used to make a program.
> Doing this for text making will seem strange at first, but it
> might help people better understand that they are always tool
> using in the writing they do, and that ChatGPT is, at best,
> just another tool they may use.  And, knowing what tools have
> been used opens a dimension along which to assess the writing:
> students should be given credit for imaginative, creative,
> humorous, ironic use of ChatGPT, not prohibited from using
> it.  (And quietly advised not to do silly things like name
> ChatGPT as a co-author.)
> Best regards,
> Tim

        Date: 2023-02-07 12:23:58+00:00
        From: Christian-Emil Smith Ore <>
        Subject: Re: [Humanist] 36.383: ChatGPT as author

Dear all,

The journal Nature has an interesting and informative article about ChatGPT as
a co-author. Nature has stopped accepting ChatGPT as a co-author. It can of
course be used, but text produced by ChatGPT should be marked and given a
proper reference.

My impression is that neither the Hubble Space Telescope nor the accelerator at
CERN is listed as a co-author; both are referred to as tools. It is a general
tendency for humans to anthropomorphize new gadgets.


Nature 613, 620-621 (2023)

        Date: 2023-02-07 07:23:56+00:00
        From: Willard McCarty <>
        Subject: trusting machines and people

Trust in devices and people who provide answers or responses has, I'd
imagine, always been important to those who have consulted them, going
back to oracular, divinatory sources. We are apt to forget the "hard
work" involved in trusting someone or something, as Pascal Boyer has
written in response to Tanya Luhrmann's studies of religious conviction
('Why “belief” is hard work', HAU: Journal of Ethnographic Theory 3.3
(2013): 349-57). I was reminded of the problem this morning with the
notice that the U.S. National Academy of Sciences and the Nobel
Foundation are holding an international meeting, “Truth, Trust and
Hope”, for a global dialogue on how to stop misinformation from eroding
public trust in science: <>.

The current thread shows that the problem of trust goes beyond the 
natural sciences. The weakening of trust that person X wrote text Y 
poses a problem for the humanities and human sciences. If, as seems 
likely, Big Data collections of writings of all kinds continue to swell, 
will it not become ever more difficult to trust, say, student compositions, 
paper submissions to journals and so on? Then there's the problem of 
how representative the artificially collected vox populi will actually be.

The rapidity of textual exchange between actual people poses a parallel
problem, I'd think, by eroding the time needed for reflection and
careful argument. Take Twitter and the phenomenon of twitterstorms, for

It would be useful to have to hand careful studies of 'social media' in
a historical and anthropological context that take into account this
question of trust, or faith, if you will, in the reliability of what we
read and shape our lives with. Are there any of these?

Willard McCarty,
Professor emeritus, King's College London;
Editor, Interdisciplinary Science Reviews;  Humanist

Unsubscribe at:
List posts to:
List info and archives at:
Listmember interface at:
Subscribe at: