- Comments: Gated by NETNEWS@AUVM.AMERICAN.EDU
- Path: sparky!uunet!europa.asd.contel.com!paladin.american.edu!auvm!SJSUVM1.BITNET!KILSER
- Message-ID: <SAS-L%92072420092904@UGA.CC.UGA.EDU>
- Newsgroups: bit.listserv.sas-l
- Date: Fri, 24 Jul 1992 17:03:09 PDT
- Reply-To: Max Nelson-Kilger <KILSER@SJSUVM1.BITNET>
- Sender: "SAS(r) Discussion" <SAS-L@UGA.BITNET>
- From: Max Nelson-Kilger <KILSER@SJSUVM1.BITNET>
- Subject: SAS on SunOS - possible bug with large data sets.......
- Lines: 11
-
- When reading raw data into SAS where there are a large number of
- records per case (100+), we get a data segmentation error and the
- job aborts. We've tried -memsize 16M, increasing the size of the
- buffers, the number of buffers, etc., and still get the same error.
- We're running SAS 6.07 on SunOS 4.1.2 and 4.1.0 with 40MB of memory.
- Looking at memory usage, it appears that SAS isn't getting enough
- memory to do the job; it generally shows about 1MB in use. Our
- config.sas has max memory set to 40MB, so that isn't the limit.
- Removing some of the format statements sometimes does the trick and
- the job doesn't bomb. Does anybody have a clue, or has anyone run
- into this too?
- Max Nelson-Kilger
-