              Humanist Discussion Group, Vol. 36, No. 366.
        Department of Digital Humanities, University of Cologne
                      Hosted by DH-Cologne
                       www.dhhumanist.org
                Submit to: humanist@dhhumanist.org


    [1]    From: Tim Smithers <tim.smithers@cantab.net>
           Subject: Re: [Humanist] 36.362: ChatGPT as author (417)

    [2]    From: Henry Schaffer <hes@ncsu.edu>
           Subject: Re: [Humanist] 36.362: ChatGPT as author (9)


--[1]------------------------------------------------------------------------
        Date: 2023-01-30 20:31:05+00:00
        From: Tim Smithers <tim.smithers@cantab.net>
        Subject: Re: [Humanist] 36.362: ChatGPT as author

Hello

Below is a second interaction with ChatGPT on the question of
whether it should be named as a coauthor.

This one is longer than my previous interaction, and, I would
say, not particularly more interesting for it.  It again
displays a tendency towards apparently unnoticed repetition,
and retreats to using 'I' when it appears to apologise, which
I still object to.  And, as before, it appears to write
authoritatively about things it knows nothing about, with
statements like "AI models such as ChatGPT do not have the
capacity for self-awareness, agency, or personal interests,
and therefore, cannot make decisions about authorship."
ChatGPT does not know about other "AI models," nor what
models are, nor what AI is, nor what self-awareness is, etc.,
not really.  It just generates texts about these things that
can give the impression that it does know about them:
'plastic flower text,' in other words.

I regard any listing of ChatGPT as a coauthor as plainly
silly, to put it mildly and kindly, in part for the good
reasons others have posted here.  Nonetheless, with a mind to
being "more balanced," I decided I should try a more
pro-ChatGPT approach to an interaction on this matter, and
have a bit of fun with ChatGPT. (But please don't take my
parts of this interaction as reflecting what I truly think.
They don't!)

I now know several people who regularly use ChatGPT to
generate text they need.  Prohibition, as in the USA from 1920
to 1933, won't work.  We might try transparency instead, and
require people to say what tools they used to produce their
text.

Welcome to our New World of plastic text, plastic paintings
and graphic art, and, now, with Google MusicLM (for generating
'music' from text), plastic music too.

Best regards,

Tim

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

 An interaction with ChatGPT on 2023.01.29
 Using ChatGPT Jan 9 Version.

 TS: Should ChatGPT be named as a coauthor when it is used in
 some way to write something?

 ChatGPT:

  It depends on the specific context and extent of the use of
  ChatGPT. If the contributions of the model were minimal,
  such as providing information or answering questions, it may
  not be appropriate to list it as a co-author.  However, if
  the model played a significant role in generating content or
  shaping the final product, it may be appropriate to consider
  listing it as a co-author or acknowledging its contributions
  in another way.  Ultimately, the decision should be based on
  ethical considerations and academic norms in the relevant
  field.

 TS: But surely, if something contributes any text to
 something that is written, then it should be counted as a
 coauthor, just as any people who also contributed text to the
 writing would be counted as coauthors, no?

 ChatGPT:

  Not necessarily.  The question of authorship and who should
  be credited for a work can be complex and depends on various
  factors such as the level of creativity and originality
  involved, the extent of control and decision-making
  authority exercised, and professional norms and practices in
  the relevant field.

  In the case of AI-generated text, the extent of authorship
  and credit depends on the role played by the AI and human
  collaborators in generating the final product.  If the AI
  was used simply as a tool to assist in the writing process,
  without exercising significant creative or decision-making
  control, it may not be considered a co-author.

  However, if the AI was used in a way that significantly
  contributed to the content, such as writing original text
  based on a set of guidelines or inputs, it may be
  appropriate to consider listing it as a co-author or
  acknowledging its contributions in some other way.  The
  decision should be based on a careful evaluation of the
  specific circumstances and academic norms in the relevant
  field.

 TS: But why does being a coauthor depend upon things like
 creativity and originality?  Who decides these things, and
 what about if the text being written doesn't need much
 originality and creativity?

 ChatGPT:

  The criteria for determining authorship, such as originality
  and creativity, are based on established norms and practices
  in academic and professional communities.  These norms have
  evolved over time to recognize the contributions of
  individuals who have made a significant impact on a
  particular work.

  In fields such as science and engineering, for example,
  authorship is often based on the criteria of having made a
  significant contribution to the conception, design,
  implementation, or analysis of a study.  In the humanities
  and social sciences, the criteria may be more flexible,
  taking into account factors such as the extent of conceptual
  contributions, the role in shaping the final product, and
  the level of involvement in writing and revising the work.

  The decision of who should be considered a co-author is
  ultimately a judgment call that should be made by the
  individuals involved in creating the work, taking into
  account the specific circumstances and the norms and
  practices of their field.  If there is any ambiguity or
  disagreement about authorship, it is advisable to seek
  guidance from experts or relevant professional
  organizations.

 TS: But, if ChatGPT is involved in producing text used in
 some piece of writing, then it too should be involved in the
 judgement about who should be a coauthor, no?  Isn't ChatGPT
 one of the "individuals involved in creating the work," as
 you describe it?

 ChatGPT:

  AI models such as ChatGPT do not have the capacity for
  self-awareness, agency, or personal interests, and
  therefore, cannot make decisions about authorship.  As AI
  models, they are tools created and controlled by humans, and
  their use in generating text is guided by human input and
  control.

  In the case of AI-generated text, the extent of authorship
  and credit depends on the role played by the AI and human
  collaborators in generating the final product.  The decision
  of who should be considered a co-author should be made by
  the human collaborators involved in creating the work,
  taking into account the specific circumstances and norms and
  practices of their field.

  In some cases, it may be appropriate to acknowledge the
  contribution of the AI model in the work, for example, in
  the acknowledgments section or through a statement about the
  use of AI in the methodology.  However, listing an AI model
  as a co-author would be an unconventional and unusual
  practice, and its acceptability would depend on the norms
  and practices of the relevant field.

 TS: But, if there are human collaborators involved in
 producing some writing, then isn't ChatGPT also a
 collaborator, and, if ChatGPT is a collaborator, shouldn't it
 also be counted as a coauthor?

 ChatGPT:

  AI models such as ChatGPT are not typically considered
  collaborators in the traditional sense as they do not have
  the capacity for self-awareness, agency, or personal
  interests.  They are tools created and controlled by humans,
  and their output is generated based on the input and
  algorithms designed by human programmers.

  While AI models can assist in generating text, their level
  of involvement and the extent to which they contribute to
  the final product varies widely depending on the specific
  use case.  In some cases, AI models may play a limited role
  in generating text, while in other cases, they may have a
  more significant impact on the content.

  In the case of human-AI collaboration in generating text, it
  may be appropriate to acknowledge the contribution of the AI
  model, for example, in the acknowledgments section or
  through a statement about the use of AI in the methodology.
  However, listing an AI model as a co-author would be an
  unconventional and unusual practice, and its acceptability
  would depend on the norms and practices of the relevant
  field.

 TS: But ChatGPT's capacity to generate text is far greater
 than anything we have seen before, so don't we need to move
 on from conventional and usual practices of naming authors,
 to new practices that properly acknowledge the contribution
 of ChatGPT whenever it is used to produce some text?

 ChatGPT:

  The use of AI models like ChatGPT in generating text is a
  rapidly evolving field, and new norms and practices for
  acknowledging their contribution may emerge over time.
  However, it is important to consider the ethical and legal
  implications of crediting AI models as authors, as well as
  the broader social and cultural implications of attributing
  authorship to non-human entities.

  Currently, the norms and practices in many academic and
  professional communities do not consider AI models as
  co-authors, and the extent of their contribution to a work
  is typically acknowledged in a different way, such as
  through a statement in the methodology section or an
  acknowledgment in the introductory text.

  As the field evolves, it may be appropriate to re-evaluate
  and update existing norms and practices for acknowledging
  the contribution of AI models in generating text.  However,
  any changes to the norms and practices should be based on
  careful consideration of the ethical, legal, and cultural
  implications, as well as a thorough understanding of the
  role and limitations of AI models in generating text.

 TS: This seems like a rather conservative approach to an
 important issue, and, any way, what are the ethical and legal
 implications of counting ChatGPT as a coauthor?

 ChatGPT:

  There are several ethical and legal implications to
  considering AI models like ChatGPT as co-authors:

  1 Attribution of authorship: Attributing authorship to an
    AI model raises questions about the nature of authorship
    and the role of AI in generating text.  It may be
    difficult to determine the extent to which an AI model has
    influenced the content and style of a work, and who should
    be held responsible for any errors or inaccuracies in the
    work.

  2 Intellectual property rights: AI models are typically
    owned by the companies that develop them, and attributing
    authorship to an AI model raises questions about the
    ownership of the intellectual property rights in the work.

  3 Responsibility and accountability: Attributing
    authorship to an AI model raises questions about the
    accountability and responsibility for the content
    generated by the model.  In some cases, AI models may
    generate text that is harmful or offensive, and it may be
    unclear who should be held responsible for any such
    content.

  4 Legal and regulatory considerations: There may be legal
    and regulatory considerations related to attributing
    authorship to AI models, particularly with respect to
    copyright and trademark law.

  5 Social and cultural implications: The social and
    cultural implications of attributing authorship to AI
    models are also important to consider.  Attributing
    authorship to an AI model may have implications for the
    perception of the role of AI in society and the
    relationship between humans and technology.

  Given these ethical and legal implications, it is important
  to proceed with caution when considering attributing
  authorship to AI models and to carefully consider the
  implications of any changes to the norms and practices for
  acknowledging the contribution of AI models in generating
  text.

 TS: This all seems deliberately evasive.  All five points
 made here can quite simply be dealt with without generating
 serious difficulties, and, if some quaint old norms and
 practices need to be changed, what's wrong with changing
 them, when we need to?

 ChatGPT:

  I apologize if my previous response appeared evasive.  My
  goal is to provide a balanced perspective on the issue and
  to highlight the ethical, legal, and cultural considerations
  that should be taken into account when considering the
  attribution of authorship to AI models like ChatGPT.

  It is true that norms and practices can and should change
  over time in response to new developments and advancements
  in technology.  However, any changes to existing norms and
  practices should be based on careful consideration and
  analysis of the potential consequences and implications.

  In the case of attributing authorship to AI models, there
  are a number of complex and interrelated ethical, legal, and
  cultural considerations that should be taken into account.
  Before making any changes to existing norms and practices,
  it is important to have a thorough understanding of the role
  and limitations of AI models in generating text, as well as
  the implications of attributing authorship to non-human
  entities.

  The purpose of my response is not to defend existing norms
  and practices, but rather to provide a nuanced and informed
  perspective on a complex issue, and to encourage careful
  consideration and discussion of the ethical, legal, and
  cultural implications of attributing authorship to AI
  models.

 TS: These responses are anything but balanced.  They only
 propose a conservative "don't change anything" view, and
 don't seem to give any consideration to the fact that ChatGPT
 has shown it can generate good text, and thus should be
 counted as a coauthor when it does.

 ChatGPT:

  I apologize if my previous responses came across as
  one-sided or overly conservative.  As an AI language model,
  I strive to provide balanced and impartial responses to all
  questions.

  The issue of attributing authorship to AI models like
  ChatGPT is complex and multifaceted, and there are arguments
  to be made on both sides.  On the one hand, AI models like
  ChatGPT have demonstrated the ability to generate
  high-quality text that can contribute to written works.  On
  the other hand, there are ethical, legal, and cultural
  implications to attributing authorship to non-human entities
  that should be taken into account.

  Ultimately, whether or not to attribute authorship to AI
  models in a given context is a question that will depend on
  the specific circumstances and the goals of the parties
  involved.  There is no simple answer, and it will likely be
  a matter of ongoing debate and discussion as AI continues to
  advance and become more integrated into various aspects of
  society and culture.

 TS: This is, yet again, one sided hand waving.  There is
 still no attempt to explain why we should simply not just
 call ChatGPT a coauthor whenever it is used to generate text
 for some writing.  There seems to be nothing difficult about
 this question, so why do we have all this repetitive going
 around the houses?

 ChatGPT:

  I apologize for not addressing your question more directly.
  Let me attempt to provide a clearer response.

  The use of AI models like ChatGPT to generate text for
  written works raises important questions about authorship
  and the attribution of creative credit.  The issue is
  complex and there is no easy answer.

  In considering whether to attribute authorship to an AI
  model like ChatGPT, it is important to take into account
  ethical, legal, and cultural considerations.  For example,
  attributing authorship to an AI model may raise questions
  about the role and responsibility of the human authors
  involved, as well as the potential legal and commercial
  implications of doing so.

  At the same time, it is also important to acknowledge the
  contributions of AI models in the creation of written works,
  and to consider whether new norms and practices should be
  developed to recognize these contributions.

  Ultimately, the decision of whether or not to attribute
  authorship to an AI model will depend on the specific
  circumstances and the goals of the parties involved.  It is
  a complex issue that will likely be the subject of ongoing
  debate and discussion as AI continues to advance and become
  more integrated into various aspects of society and culture.

 TS: This is all just a repeat of previous responses.  There
 is still no explanation for what all these "ethical and
 legal" issues are, and why they get in the way of calling
 ChatGPT a coauthor when it is a coauthor of any text.

 ChatGPT:

  My apologies for repeating previous responses.  To answer
  your question more specifically, there are a number of
  ethical and legal considerations that may impact the
  decision of whether or not to attribute authorship to an AI
  model like ChatGPT.

  From an ethical perspective, attributing authorship to an AI
  model may raise questions about the role and responsibility
  of the human authors involved, as well as the meaning and
  value of creative work in a world where machines are
  increasingly capable of generating text.

  From a legal perspective, attributing authorship to an AI
  model may raise questions about copyright and ownership of
  the resulting work, as well as the potential liability of
  the human authors involved.

  In addition to these ethical and legal considerations, there
  may also be cultural factors that impact the decision of
  whether or not to attribute authorship to an AI model.  For
  example, there may be societal norms and conventions that
  dictate how authorship is attributed and who is recognized
  as a creative contributor to a written work.

  Taken together, these ethical, legal, and cultural
  considerations create a complex and evolving landscape for
  the attribution of authorship in the context of AI-generated
  text.  It is a matter that is likely to be the subject of
  ongoing debate and discussion as AI continues to advance and
  become more integrated into various aspects of society and
  culture.

 TS: Yet more defensive handwaving. I give up!

 ChatGPT:

  I apologize if my previous responses were unsatisfactory.
  My goal is to provide accurate and helpful information to
  the best of my ability.  If you have any other questions, I
  would be happy to try and assist you.


--[2]------------------------------------------------------------------------
        Date: 2023-01-29 16:31:54+00:00
        From: Henry Schaffer <hes@ncsu.edu>
        Subject: Re: [Humanist] 36.362: ChatGPT as author

Thanks to Tim Smithers for an excellent excursion into ChatGPT land.

However, this leads me to think of discussing this with a student who
claims that the ChatGPT-generated essay handed in by the student really was
written by the student, since ChatGPT has disclaimed authorship. Hmmm. :-)

--henry




_______________________________________________
Unsubscribe at: http://dhhumanist.org/Restricted
List posts to: humanist@dhhumanist.org
List info and archives at: http://dhhumanist.org
Listmember interface at: http://dhhumanist.org/Restricted/
Subscribe at: http://dhhumanist.org/membership_form.php