- Newsgroups: sci.math.symbolic
- Path: sparky!uunet!think.com!mintaka.lcs.mit.edu!zurich.ai.mit.edu!jaffer
- From: jaffer@zurich.ai.mit.edu (Aubrey Jaffer)
- Subject: Differentiation
- Message-ID: <JAFFER.92Nov15195716@camelot.ai.mit.edu>
- Sender: news@mintaka.lcs.mit.edu
- Organization: M.I.T. Artificial Intelligence Lab.
- Date: Mon, 16 Nov 1992 00:57:16 GMT
- Lines: 24
-
- This paper is from the Numerical Optimisation Centre of Hatfield
- Polytechnic, ENGLAND. The principal author is L.C.W. Dixon. Its premise is:
-
- "In most optimisation algorithms it is assumed that it is possible
- to calculate both the objective function F(x), x [in] R^n and its
- gradient vector at any requested value of x. Second order methods are
- well known that could utilise the Hessian matrix DEL^2F(x) if this
- could be computed efficiently, however traditionally such algorithms
- have not been favoured for problems with n>5 because of the difficulty
- of analytically coding the expressionf for DEL^2F for complicated
- industrial problems.
- "In a typical industrial problem the code necessary to compute F(x),
- given x, may well consist of 200 lines of FORTRAN and involve 1000
- arithmetic steps; even if n is realatively small (say n=20).
- "The task of deriving formula for the n values of the gradient for
- such an objective function is daunting and may well take man-days of
- effort... Most symoblic differentiation codes such as that included
- in REDUCE and MACSYMA are defeated by the length of such a task
- Griewank [1988]."
-
- Has some reader of this newsgroup worked on such a problem? I don't
- think of differentiation as a hard problem. It seems to me that the
- derivative of such a function will be large no matter how it is
- arrived at. What am I missing?
-