Path: sparky!uunet!mcsun!uknet!mucs!lucs!hes.csc.liv!bruce
From: bruce@supr.scm.liv.ac.uk (Bruce Stephens)
Newsgroups: comp.text.tex
Subject: Re: Big TeX's -- how hard are they to come by?
Message-ID: <BRUCE.92Sep14115253@suprenum.supr.scm.liv.ac.uk>
Date: 14 Sep 92 10:53:48 GMT
References: <BuEC6r.B47@news.cso.uiuc.edu> <18p7lpINNc73@almaak.usc.edu>
	<ROLFL.92Sep11084644@karl.uio.no>
Sender: news@csc.liv.ac.uk (News Eater)
Organization: Centre for Mathematical Software Research, Univ. Liverpool, UK
Lines: 25
Nntp-Posting-Host: supr.scm.liv.ac.uk

>>>>> On 11 Sep 92 07:46:44 GMT, rolfl@karl.uio.no (Rolf Lindgren) said:

> When Knuth wrote TeX, he knew what he was doing. This is a fundamental
> assumption, and all arguments pro or con should be checked against this.

Agreed.

> Now, Web, I suppose, supports dynamic allocation of memory through
> Pascal's new() operator, whose behavior in extreme cases is unspecified
> and the execution speed of which is compiler dependent.

The problems are the `behaviour in extreme cases' (i.e., when there's no
memory left) and the fact that in standard Pascal you can't allocate an
array whose dimensions are determined at run time.

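For concreteness, here's a rough sketch in C (my own illustration, nothing
to do with TeX's actual source) of the two things standard Pascal's new()
doesn't give you: a specified failure mode when memory runs out, and an
array whose size is chosen at run time:

    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        size_t n, i;
        int *table;

        /* size chosen at run time, e.g. from the command line */
        n = (argc > 1) ? (size_t) strtoul(argv[1], NULL, 10) : 1000;

        /* malloc's "extreme case" is specified: it returns NULL */
        table = malloc(n * sizeof *table);
        if (table == NULL) {
            fprintf(stderr, "out of memory for %lu entries\n",
                    (unsigned long) n);
            return 1;
        }

        for (i = 0; i < n; i++)      /* use the run-time-sized array */
            table[i] = 0;

        free(table);
        return 0;
    }
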
Essentially it's the lack of pointer arithmetic, I guess. Speed has
nothing to do with it (the speed of *everything* is compiler dependent),
except that Knuth could have done everything in terms of dynamically
allocated linked lists rather than arrays, which would obviously have
been slower and would definitely have been a Bad Thing in terms of
clarity.

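The array-based alternative is to carve list nodes out of one fixed-size
array and use integer indices in place of pointers, so allocation failure
stays explicit and standard Pascal stays happy.  That is the general trick
TeX plays with its big mem array, though the sketch below (in C, with
invented names) is mine, not anything from tex.web:

    #include <stdio.h>

    #define POOL_SIZE 1000    /* fixed at compile time, like TeX's arrays */
    #define NONE (-1)         /* the "null pointer" within the pool */

    struct node { int value; int next; };  /* `next' is an index */

    static struct node pool[POOL_SIZE];
    static int pool_top = 0;               /* next unused slot */

    /* allocate a node from the pool; the exhausted case is explicit */
    static int get_node(int value, int next)
    {
        if (pool_top >= POOL_SIZE)
            return NONE;
        pool[pool_top].value = value;
        pool[pool_top].next = next;
        return pool_top++;
    }

    int main(void)
    {
        int head = NONE, p, i;

        for (i = 0; i < 5; i++)            /* build a five-element list */
            head = get_node(i, head);
        for (p = head; p != NONE; p = pool[p].next)
            printf("%d\n", pool[p].value);
        return 0;
    }
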
--
Bruce Stephens. Centre for Mathematical Software Research, Liverpool Univ.
Internet: bruce@uxb.liv.ac.uk JANET: bruce@uk.ac.liv.uxb