- Path: sparky!uunet!olivea!pagesat!spssig.spss.com!markrose
- From: markrose@spss.com (Mark Rosenfelder)
- Newsgroups: comp.ai.philosophy
- Subject: Re: COMPUTATIONAL BEHAVIOR MODEL
- Message-ID: <C17yDn.FIy@spss.com>
- Date: 21 Jan 93 19:34:35 GMT
- References: <93020.100600ICJGA@ASUACAD.BITNET>
- Sender: news@spss.com (Net News Admin)
- Organization: SPSS Inc.
- Lines: 33
-
- In article <93020.100600ICJGA@ASUACAD.BITNET> <ICJGA@ASUACAD.BITNET> writes:
- > Regarding emotions, the model does give the APPEARANCE of emotions, but
- >the appearance is the result of the emotions being there. That means that
- >the emotions must have computable definitions. For example, given a
- >situation that is "undesirable" (i.e. StoryPal has a "problem"), such as
- >hunger, a STRATEGY is available to deal with it. That strategy has in it
- >one or more PLANS. Each plan has in it a set of steps, the final one in
- >a plan being the absence of the problem. If StoryPal cannot get to any
- >point on any plan in the strategy, we describe that as being "hopeless".
- >Certain expressions go with being hopeless, giving an appearance of having
- >the emotion hopelessness, but it starts with a real emotion resulting from
- >a real situation. And being hopeless (not being able to get on a plan)
- >would definitely influence actions (even if such action is to stand still
- >or run around in circles or lie down and cry).
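-
- (For concreteness, the quoted mechanism might be sketched in Python
- roughly as follows. This is one minimal reading of the description
- above: StoryPal, STRATEGY, PLAN, and "hopeless" are the poster's
- terms, while the Agent class and its can_reach() test are assumptions
- added here for illustration.)
-
-     class Plan:
-         """One plan within a strategy: an ordered list of steps,
-         the final step being the absence of the problem."""
-         def __init__(self, steps):
-             self.steps = steps
-
-         def first_reachable_step(self, agent):
-             # The first step the agent can currently get to, or None.
-             for step in self.steps:
-                 if agent.can_reach(step):
-                     return step
-             return None
-
-     class Strategy:
-         """A strategy for one kind of problem, holding one or more plans."""
-         def __init__(self, plans):
-             self.plans = plans
-
-         def is_hopeless(self, agent):
-             # "Hopeless" as described: the agent cannot get to any
-             # point on any plan in the strategy.
-             return all(p.first_reachable_step(agent) is None
-                        for p in self.plans)
-
-     class Agent:
-         """Hypothetical agent; can_reach() stands in for the world model."""
-         def __init__(self, reachable):
-             self.reachable = set(reachable)
-
-         def can_reach(self, step):
-             return step in self.reachable
-
-     hunger = Strategy([Plan(["find food", "eat", "hunger gone"])])
-     stuck = Agent(reachable=[])           # cannot get to any step
-     print(hunger.is_hopeless(stuck))      # True -> labelled "hopeless"
-
- On this reading, "hopelessness" is nothing more than the boolean
- returned by is_hopeless() -- which is precisely what is at issue below.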
-
- Just because some computational state occurs in your program and you
- call it an "emotion" doesn't mean that it's a "real emotion". You are
- just playing with words.
-
- To ask a few elementary questions: why do you label the state of not having a
- plan "hopelessness"? Why not "confusion", or "terror", or "frustration",
- or "anger at the programmer", or "quiet resignation", or "satisfaction at
- having done all one can do"?
-
- What is the point of such labelling anyway? Do you seriously believe
- you are modelling how human emotions work? Or does the "hopeless" program
- act just like a "hopeless" human being-- not just assuming a facial
- expression, but reducing its energy level even in other tasks, becoming
- aggressive or withdrawn, seeking distractions, contemplating suicide,
- reading a lot of Kafka?
-
- You're not doing yourself a favor by redefining the goals of AI so narrowly
- that you can claim to have already achieved them.
-