Comments: Gated by NETNEWS@AUVM.AMERICAN.EDU
Path: sparky!uunet!zaphod.mps.ohio-state.edu!darwin.sura.net!paladin.american.edu!auvm!FAC.ANU.EDU.AU!ANDALING
Message-ID: <9209012348.AA10095@fac.anu.edu.au>
Newsgroups: bit.listserv.csg-l
Date: Wed, 2 Sep 1992 09:48:16 EST
Sender: "Control Systems Group Network (CSGnet)" <CSG-L@UIUCVMD.BITNET>
From: Avery Andrews <andaling@FAC.ANU.EDU.AU>
Subject: Planning and Agre's Cooking Problem
Lines: 55

[from Avery Andrews (920902.0914)]
(Thomas Baines 92090909)

> I think Avery almost has it.

Spurred on by this, I'll dump the rest of this file.  Before Penni
introduced me to Interactive AI, but after I started getting CSGNet,
it struck me that there was something very weird about the discussions
of the `frame problem' in my AI books, because they seemed to be
concerned with the problems faced by programs that attempted to imagine
everything that might happen in some domain in which they had no
practical experience.

By contrast, real planning, as carried out with some modicum of success
by real people (we have 4 people, 1 car, 7 hard commitments & 3 wishes --
how are we going to get thru Saturday?) is carried out in domains where
people have a lot of practical experience.  Consider a typical operator
(= step in a plan).  In addition to its desired effect, it will have
various side-effects, which can be grouped into three types:

a) irrelevant (when I execute DRIVE(OWEN,BELCONNEN), the place on the
   driveway where the car is normally parked will get wet if it
   rains).

b) relevant, but routinely controllable (as a result of the driving, the
   fuel-level of the car will decrease).  One of the effects of culture
   is to increase the routine controllability of side effects.

c) relevant, but uncontrollable (while I am doing the driving, there
   won't be any car at home, so no other person can do any driving).

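To make the taxonomy concrete, here is a minimal sketch in Python (my
own rendering, not anything from a real planner; all the names are
made up) of an operator whose side effects carry one of the three tags:

  from dataclasses import dataclass, field
  from enum import Enum
  from typing import List

  class SideEffectType(Enum):
      IRRELEVANT = "a"              # the parking spot gets wet if it rains
      ROUTINELY_CONTROLLABLE = "b"  # fuel level drops; refill later
      UNCONTROLLABLE = "c"          # no car at home while the trip is on

  @dataclass
  class SideEffect:
      description: str
      kind: SideEffectType

  @dataclass
  class Operator:
      name: str
      desired_effect: str
      side_effects: List[SideEffect] = field(default_factory=list)

      def uncontrollable_effects(self) -> List[SideEffect]:
          """The type c) effects that real planning has to respect."""
          return [s for s in self.side_effects
                  if s.kind is SideEffectType.UNCONTROLLABLE]

  drive = Operator(
      name="DRIVE(OWEN,BELCONNEN)",
      desired_effect="Owen ends up at Belconnen",
      side_effects=[
          SideEffect("parking spot on the driveway gets wet if it rains",
                     SideEffectType.IRRELEVANT),
          SideEffect("fuel level of the car decreases",
                     SideEffectType.ROUTINELY_CONTROLLABLE),
          SideEffect("no car at home while the trip is under way",
                     SideEffectType.UNCONTROLLABLE),
      ],
  )

  print([s.description for s in drive.uncontrollable_effects()])

Only the type c) entries feed into any conflict-spotting; the type a)
and b) ones just ride along as annotations.
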
So the real-world planner can find out from hearsay and experience what the
type (c) consequences of his/her actions are, and construct `clean' sets of
operators that have few such consequences.  The general intractability of
planning means that real plans have to be quite short, so that experienced
and competent people would have large libraries of operators, which might
be individually quite complicated (like the tricks that people make up to do
the last layer of the Rubik's cube).

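One way to picture what such a library buys you (again just a sketch,
with invented names): record, for each operator, which resources its
type (c) side effects tie up, and then a short plan can be screened for
steps that clash over a resource only one of them can have.

  from typing import Dict, List, Set, Tuple

  # Operator library: each entry maps an operator to the exclusive
  # resources its type (c) side effects make unavailable.
  LIBRARY: Dict[str, Set[str]] = {
      "DRIVE(OWEN,BELCONNEN)": {"car"},
      "DRIVE(MEG,CIVIC)":      {"car"},
      "COOK(DINNER)":          {"kitchen"},
      "PHONE(GRANDMA)":        set(),
  }

  # A short plan: (time slot, operator) pairs.
  Plan = List[Tuple[int, str]]

  def conflicts(plan: Plan) -> List[Tuple[str, str, str]]:
      """Pairs of same-slot steps that both need a resource only one
      of them can have, together with the resource in question."""
      clashes = []
      for i, (slot_i, op_i) in enumerate(plan):
          for slot_j, op_j in plan[i + 1:]:
              if slot_i != slot_j:
                  continue
              for resource in LIBRARY[op_i] & LIBRARY[op_j]:
                  clashes.append((op_i, op_j, resource))
      return clashes

  saturday = [(1, "DRIVE(OWEN,BELCONNEN)"),
              (1, "DRIVE(MEG,CIVIC)"),      # oops: there is only one car
              (2, "COOK(DINNER)"),
              (2, "PHONE(GRANDMA)")]

  print(conflicts(saturday))
  # [('DRIVE(OWEN,BELCONNEN)', 'DRIVE(MEG,CIVIC)', 'car')]

Experience amounts to having the library entries already filled in, so
the check is cheap; the hard case is when an entry silently goes stale,
as in the example below.
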
In terms of everyday life, it strikes me that errors are very often made
when a side-effect of an operator switches from type a) to type c).
E.g., when a family with two drivers is reduced from having two cars to
only one, they are constantly making plans which presuppose that another
car will be available when one has been driven off somewhere.  It took
my wife and me a tremendously long time to stop making these kinds of
errors after we had the use of a friend's car for six months (tho we did
usually manage to spot our mistake before anyone actually drove off to
somewhere).

And, as Penni has been saying to me recently, a lot of the info about
useful operations doesn't have to be figured out at all - you can pick it
up by watching and listening, because it's just floating around in the
culture.

Avery.Andrews@anu.edu.au