Newsgroups: comp.ai.philosophy
Path: sparky!uunet!stanford.edu!Csli!avrom
From: avrom@Csli.Stanford.EDU (Avrom Faderman)
Subject: Re: Brain and mind (killing boring wife)
Message-ID: <1992Nov17.235643.21480@Csli.Stanford.EDU>
Organization: Stanford University CSLI
Date: Tue, 17 Nov 1992 23:56:43 GMT
Lines: 96

In article <1992Nov11.172601.632@cine88.cineca.it>
avl0@cine88.cineca.it writes:

| In article <1992Nov4.234723.22038@Csli.Stanford.EDU>
| avrom@Csli.Stanford.EDU (Avrom Faderman) writes:
...
|> Imagine a society
|> that was generally evil. All the customs, all the mores of this society
|> are complete inverses of our own--wanton torture is considered a perfectly
|> acceptable way to pass an afternoon. I think even a hard-core anti-mechanist
|> would agree that, without exposure to other societies, it would be next to
|> impossible for any member to abandon these evil ways. But does this mean
|> that they are not morally responsible for their actions?
...
| Morality judges actions in relation to a Fundamental Goal.
| The society you have described is a moral one if the Fundamental Goal of
| its people is to have a funny afternoon.

Well, I guess this is a matter of intuition. By my lights though (and
I think most people will agree with me on this one), a society that
gleefully tortures innocent people in a neverending quest to have a
funny afternoon is an immoral one.

| Responsibility is merely a consequence of free will, whether the action
| is moral or not.

Even were I to accept your previous claim about Fundamental Goals (and I
don't), I don't see how it would obligate me to accept this. It still
doesn't strike me as obvious (or even true).

|> it is by
|> no means clear to me that you could sell or buy a robot that had
|> considerable functional similarity to a human.
...
| You acknowledge that human beings have some value, and
| you try to derive it from their functionality. Then robots which have a
| similar functionality have a similar value.
| Let me be a little presumptuous: I have just built the dreamed-of android.
| It's Me, its creator; are you really saying that I cannot switch it off?!
| *I* wanted it to behave like myself, *I* wanted it to have, e.g., electronic
| dreams to re-organise acquired information, *I* want it to stop living,
| why not?
...
| The old Romans were much more consistent with a materialistic view of Man.
...
| Hitler was also consistent: mad people have less functionality (or none at
| all) than other humans, THEREFORE they are less human (or not human at all)
| than the others, and he killed them.

My fault for using a slightly ambiguous term (functional). I don't mean
the organism's _use_ to society. I mean internal structure. If an
organism has internal states corresponding to pleasure and pain in
humans, and these internal states interact with one another, with the
organism's behaviour, and with the rest of the organism's internal states
in the way that human pleasure and pain interact with the analogous human
structures, then the organism is functionally equivalent to a human (the
way I'm using the term).

I said in the previous post that I wasn't going to try to give a positive
argument that functional similarity to a human is sufficient for being
a moral patient, but since I doubt you would consider the claim intuitive,
here's a sketchy attempt at one.

When I say of something that it is moral, I don't mean it jibes with the
Fundamental Goal of my society (which I may disagree with) or with some
Platonist Form of the Good (which I have no evidence exists). I am
instead expressing my approval of it--I am letting it be known that
I have a reaction of applause when I contemplate it. Now, what
kinds of actions trigger (internal) applause in me? Or rather, more
to the point of the "Killing Boring Wife" example, what kinds of actions
trigger outrage in me?

When I view another entity, and see that it has all the internal workings
that I identify in myself, it is quite natural and involuntary that I
empathize with it. When I witness its pain, I imagine _myself_ in
such pain, and I am strongly inclined to try to put a stop to it (by
locking the perpetrator up, for example). This, I would advocate, is
much of the basis of morality--witness the fact that the first step
in inspiring people to commit atrocities has historically been the
alienation and dehumanization (the denial of functional similarity) of
the prospective victim.

When I empathize with the robot you create, the knowledge that you made
it does nothing to diminish this--any more than, if you were a genetic
engineer who created a human from basic organic molecules, the knowledge
of that fact would kill my empathy with the product of your efforts.

--
Avrom I. Faderman             | "...a sufferer is not one who hands
avrom@csli.stanford.edu       | you his suffering, that you may
Stanford University           | touch it, weigh it, bite it like a
CSLI and Dept. of Philosophy  | coin..."  -Stanislaw Lem