Newsgroups: comp.text.tex
Path: sparky!uunet!mcsun!ieunet!tcdcs!maths.tcd.ie!tim
From: tim@maths.tcd.ie (Timothy Murphy)
Subject: Re: Big TeX's -- how hard are they to come by?
Message-ID: <1992Sep11.230429.24578@maths.tcd.ie>
Organization: Dept. of Maths, Trinity College, Dublin, Ireland.
References: <BuEC6r.B47@news.cso.uiuc.edu> <18p7lpINNc73@almaak.usc.edu> <ROLFL.92Sep11084644@karl.uio.no>
Date: Fri, 11 Sep 1992 23:04:29 GMT
Lines: 37

rolfl@karl.uio.no (Rolf Lindgren) writes:

>Now, Web, I suppose, supports dynamic allocation of memory through
>Pascal's new() operator, whose behavior in extreme cases is unspecified
>and the execution speed of which is compiler dependent.

>A similar problem is that if you use dynamic allocation of memory, you
>can't compare the speed of different implementations of a given algorithm,
>partly because the speed of different types of pointer allocation is
>compiler dependent and not easily computed. To be able to rank
>implementations of algorithms according to their speed is crucial if a
>program does a lot of operations on its data structure, as TeX does.

Sorry, /dev/null is full.
This is really nonsense.
There was difficulty with dynamic allocation in Pascal,
but there is absolutely no reason why it should not be introduced
into TeX in C.
In fact Karl Berry (in charge of the UnixTeX distribution)
has said that he is working on this.

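For what it's worth, here is a rough sketch (mine, not Karl's, and
not the actual web2c code) of what dynamic allocation of mem could
look like on the C side: keep mem as a pointer rather than a fixed
array and realloc() it on demand.  Since TeX addresses mem by integer
index rather than by machine pointer, the contents stay valid when
the array moves.  The names mem, mem_max and memory_word echo tex.web;
grow_mem() and the sizes are made up for illustration.

#include <stdio.h>
#include <stdlib.h>

/* Stand-in for TeX's memory_word; the real one is a union whose
   field widths depend on how the change files set up halfwords. */
typedef struct {
    long lh;
    long rh;
} memory_word;

static memory_word *mem = NULL;   /* replaces the fixed array mem[mem_min..mem_max] */
static size_t mem_max = 0;        /* current capacity, in words */

/* Grow the main memory array when TeX would otherwise report
   "TeX capacity exceeded".  realloc() keeps the existing contents,
   and node "pointers" are just indices into mem, so they stay valid. */
static int grow_mem(size_t wanted)
{
    if (wanted <= mem_max)
        return 1;
    size_t new_max = mem_max ? mem_max : 65536;
    while (new_max < wanted)
        new_max *= 2;             /* double, rather than grow word by word */
    memory_word *p = realloc(mem, new_max * sizeof *p);
    if (p == NULL)
        return 0;                 /* caller falls back to the usual overflow error */
    mem = p;
    mem_max = new_max;
    return 1;
}

int main(void)
{
    if (!grow_mem(1 << 20))       /* ask for a megaword, say */
        return 1;
    mem[0].lh = 42;               /* then use mem exactly as before */
    printf("mem grown to %lu words\n", (unsigned long) mem_max);
    return 0;
}
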
However, I doubt whether it would be possible to go
from small to large TeX dynamically,
as the size of the elements in the main mem array
(and in other arrays)
differs in the two cases.
For this reason,
it would still be necessary to have both a small and a large TeX,
even if the latter were dynamically extensible.
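
To see why, look at the element type.  Schematically (this is my own
simplification, not the declarations actually used in tex.web or the
web2c texmf.h): in small TeX the halfword fields of a memory_word
only have to index some 64K words of mem, so 16 bits suffice, while
big TeX needs 32-bit halfwords, and every element roughly doubles in
size at compile time.

#include <stdio.h>
#include <stdint.h>

/* Small TeX: halfwords only need to address up to about 64K mem words,
   so 16-bit fields are enough.  (Simplified; the real memory_word is a
   union with several overlaid variants.) */
typedef struct {
    uint16_t lh, rh;
} small_two_halves;

/* Big TeX: mem runs well past 65536 words, so halfwords must be 32 bits. */
typedef struct {
    uint32_t lh, rh;
} big_two_halves;

int main(void)
{
    printf("small memory_word: %lu bytes\n",
           (unsigned long) sizeof(small_two_halves));
    printf("big   memory_word: %lu bytes\n",
           (unsigned long) sizeof(big_two_halves));
    /* The element size is fixed when the program is compiled, so an
       array of small words cannot simply be reinterpreted or extended
       into an array of big words at run time; hence separate small
       and big binaries. */
    return 0;
}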

--
Timothy Murphy
e-mail: tim@maths.tcd.ie
tel: +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland