Humanist Discussion Group

Humanist Archives: Nov. 4, 2021, 7:18 a.m. Humanist 35.344 - an ethical 'great divide'

                  Humanist Discussion Group, Vol. 35, No. 344.
        Department of Digital Humanities, University of Cologne
                       Hosted by DH-Cologne
                       www.dhhumanist.org
                Submit to: humanist@dhhumanist.org




        Date: 2021-11-03 12:55:05+00:00
        From: Henry Schaffer 
        Subject: Re: [Humanist] 35.341: an ethical 'great divide'?

Speaking from the programmer's/coder's point of view, one has to be
careful to define "outcome". If the purpose (as defined by the assignment
given to the programmer) is "get more clicks from users" and the outcome is
a higher number of clicks from users, then the correlation is high.

But then someone points out that the outcome also includes more computer
storage being used, or some other change in user behaviour (one that
has some connection with ethics) - does that change the correlation? If the
ethical impact is dreadful, does that change the programmer's intent? I
suggest that it doesn't - that the Primum non nocere rule needs to apply to
the chain of supervisors, not to the programmer.

--henry

On Wed, Nov 3, 2021 at 4:46 AM Humanist  wrote:

>         Date: 2021-11-03 08:24:59+00:00
>         From: Willard McCarty 
>         Subject: a 'great divide' in computational systems?
>
> I'd appreciate some well-informed help with the ethics of computing
> systems. In a recent post on SIGCIS, Paul Edwards drew attention to the
> correlation between purpose and outcome in these systems. In many
> instances familiar to us, the correlation is very close, so that we can say
> with confidence that their ethical neutrality is due to the purpose for
> which they were designed. When, however, computing systems become
> intimate with human conversations in the wild, as in social media,
> the designer's or implementer's purpose may become irrelevant, and the
> ethics of the system highly problematic. Sure, we may say, 'guns don't
> kill people, people kill people' -- but you might respond, guns bring out
> latent behaviours of which everyone is capable. You might also refer to
> that great science fiction movie, Forbidden Planet, and leap from it to
> Shakespeare's The Tempest (on which the movie's script was based).
>
> Comments? In computing systems, is there an historical 'moment',
> however fuzzy, when they turned the ethical corner in this respect?
>
> Many thanks.
>
> Yours,
> WM
>
> --
> Willard McCarty,
> Professor emeritus, King's College London;
> Editor, Interdisciplinary Science Reviews;  Humanist
> www.mccarty.org.uk


_______________________________________________
Unsubscribe at: http://dhhumanist.org/Restricted
List posts to: humanist@dhhumanist.org
List info and archives at: http://dhhumanist.org
Listmember interface at: http://dhhumanist.org/Restricted/
Subscribe at: http://dhhumanist.org/membership_form.php