- Newsgroups: comp.lang.c++
- Path: sparky!uunet!zaphod.mps.ohio-state.edu!sol.ctr.columbia.edu!ira.uka.de!ira.uka.de!slsvaat!us-es.sel.de!reindorf
- From: reindorf@us-es.sel.de (Charles Reindorf)
- Subject: Re: Using 64k objects
- Message-ID: <1992Nov10.102746.12514@us-es.sel.de>
- Sender: news@us-es.sel.de
- Organization: SEL-Alcatel Line Transmission Systems Dept. US/ES
- References: <1992Nov4.011604.5884@piccolo.cit.cornell.edu> <BxH4p6.8Bn@research.canon.oz.au>
- Date: Tue, 10 Nov 92 10:27:46 GMT
- Lines: 28
-
- In article <BxH4p6.8Bn@research.canon.oz.au>, colin@cole.research.canon.oz.au (Colin Pickup) writes:
- |> In article <1992Nov4.011604.5884@piccolo.cit.cornell.edu>
- |> sl14@crux3.cit.cornell.edu (Stephen Lee) writes:
- |> > [ etc ]
- |>
- |> Borland is one of the few C/C++ compilers that handles arrays bigger than
- |> 64K. To do it, either use the huge memory model for the whole program or
- |> just make the pointers to the arrays huge (read the manuals on how to do
- |> this). The second option will give better run-time performance.
-
- Is it not the case that the "huge" memory model does not imply huge pointers by default?
- As I understand it, the only difference between the "huge" and "large" memory models is
- that a separate data area is allocated for statics, string literals, etc. per translation
- unit rather than for the entire program. "Huge" as applied to memory models and "huge" as
- applied to pointers thus differ subtly. If you wish to allocate and manipulate huge
- arrays, I believe you have to use "huge" pointers explicitly.
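- 
- To make that concrete, here is a minimal sketch (assuming Borland C++ on 16-bit
- DOS; farmalloc()/farfree() live in <alloc.h>, and the manual is the authority on
- whether your library version allows far-heap blocks over 64K). The "huge"
- qualifier tells the compiler to normalize segment:offset arithmetic, so the
- indexing below can step across a 64K boundary:
- 
-     #include <alloc.h>   /* farmalloc(), farfree() -- Borland-specific */
-     #include <stdio.h>
- 
-     int main()
-     {
-         unsigned long n = 100000UL;   /* deliberately bigger than 64K */
-         char huge *big = (char huge *) farmalloc(n);
-         if (big == 0) {
-             printf("farmalloc failed\n");
-             return 1;
-         }
-         for (unsigned long i = 0; i < n; i++)
-             big[i] = 0;   /* huge arithmetic crosses segment boundaries */
-         farfree((void far *) big);
-         return 0;
-     }
- 
- With a plain far pointer the same loop would wrap around inside one 64K segment
- instead of advancing into the next one.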
-
- |> NOTE : you cannot use new to allocate arrays bigger than 64K. The size
- |> parameter for new is an unsigned, i.e. 16 bits. You must use hugealloc and
- |> hugefree (again, look in the manual for these functions).
- |> Colin Pickup
- |>
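- 
- Incidentally, the 64K limit on new arises because the size argument is a 16-bit
- unsigned, so an over-large request is silently truncated modulo 65536. A quick
- sketch of the effect (assuming a 16-bit compiler; on a 32-bit one the cast
- changes nothing):
- 
-     #include <stdio.h>
- 
-     int main()
-     {
-         unsigned long wanted = 100000UL;
-         unsigned got = (unsigned) wanted;   /* what a 16-bit new sees */
-         printf("asked for %lu bytes, new sees %u\n", wanted, got);
-         return 0;
-     }
- 
- On a 16-bit compiler this prints 34464, not 100000, which is why the manual's
- far/huge heap functions are the way to go for big blocks.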
- 
- All opinions above are my own etc.
-
- -- Charles Reindorf
-