Path: sparky!uunet!mcsun!uknet!harrier.ukc.ac.uk!owl.ukc.ac.uk!dat
From: dat@ukc.ac.uk (D.A.Turner)
Newsgroups: comp.lang.functional
Subject: Re: algorithmic efficiency and lazy languages
Message-ID: <502@owl.ukc.ac.uk>
Date: 24 Jul 92 14:36:27 GMT
References: <DTARDITI.92Jul21023549@DESERT.FOX.CS.CMU.EDU> <501@owl.ukc.ac.uk> <boehm.711912209@siria>
Reply-To: dat@ukc.ac.uk (D.A.Turner)
Distribution: comp
Organization: Computing Lab, University of Kent at Canterbury, UK.
Lines: 27

In a previous posting I wrote:
> There is no special difficulty in
>reasoning about the *time* complexity of programs written in a
>non-strict functional language. One can derive from the program
>equations for functions fi say, similar equations for Tfi, the
>time-to-execute of function fi.

To which boehm@parc.xerox.com (Hans Boehm) writes:
>It seems to me that this is insufficient. It basically tells us that it's
>possible to write accurate profilers.

Sorry, I should have been more explicit.

I meant that you could often *solve* the equations (making enough
simplifying assumptions, as one always does in asymptotic complexity
analysis) to arrive at a formula for the asymptotic complexity of each
function of interest. As mentioned, see chapter 6 of B+W (Bird and
Wadler) for some examples.
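
To make this concrete, here is a small worked example of my own
(Haskell-style notation; `rev' and the cost function T_rev are names
invented for illustration, and each equation applied counts as one
step):

    -- Naive list reversal, defined by two equations.
    rev :: [a] -> [a]
    rev []     = []
    rev (x:xs) = rev xs ++ [x]

    -- Reading the costs off the equations: (++) on a left argument of
    -- length m takes m+1 steps, so for an argument of length n
    --
    --   T_rev(0) = 1
    --   T_rev(n) = T_rev(n-1) + ((n-1) + 1) + 1       (n > 0)
    --
    -- Solving the recurrence gives T_rev(n) = (n^2 + 3n + 2) / 2,
    -- i.e. rev takes O(n^2) steps.

The simplifying assumption, as usual, is that the result is demanded
in full, in which case lazy evaluation performs essentially the
reduction steps the equations describe.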

-------------------

To add a further note on why space complexity is much less tractable:
what we are typically interested in is RESIDENCY, rather than just the
number of times `cons' gets called - the latter is clearly related to
the time analysis, as has already been remarked.
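
As a rough illustration of the distinction (a hypothetical Haskell
sketch of my own; the function names are made up, and the residency
claims assume ordinary lazy evaluation with garbage collection of
unreachable cells):

    -- Both functions cause cons to be applied n times, so a count of
    -- cons calls (and hence the time analysis) treats them alike.
    consumedAsProduced :: Int -> Int
    consumedAsProduced n = length [1 .. n]
      -- The list is consumed as it is produced and nothing else holds
      -- on to its front, so each cell can be reclaimed as soon as
      -- length has passed it: the cells add only O(1) to residency.

    retainedWhileConsumed :: Int -> Int
    retainedWhileConsumed n =
      let xs = [1 .. n]
      in if length xs > 0 then head xs else 0
      -- length forces the whole of xs while head xs still needs its
      -- front, so every cell is live when the traversal finishes: the
      -- cells add O(n) to residency, though the cons count is exactly
      -- the same as above.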

David Turner