- Newsgroups: comp.os.vms
- Path: sparky!uunet!charon.amdahl.com!pacbell.com!ames!elroy.jpl.nasa.gov!swrinde!sdd.hp.com!ux1.cso.uiuc.edu!news.cso.uiuc.edu!sb3mac.inhs.uiuc.edu!r-larkin
- From: Ronald P. Larkin <r-larkin@uiuc.edu>
- Subject: Re: Editing a large file
- References: <21JAN199314094029@ariel.lerc.nasa.gov> <1jor9sINN32k@gap.caltech.edu> <1993Jan22.060745.7218@news.acns.nwu.edu>
- Message-ID: <C1IzsD.Jx5@news.cso.uiuc.edu>
- X-Xxdate: Wed, 27 Jan 93 12:41:19 GMT
- Sender: usenet@news.cso.uiuc.edu (Net Noise owner)
- X-Useragent: Nuntius v1.1.1d16
- Organization: Illinois Natural History Survey
- Date: Wed, 27 Jan 1993 18:38:36 GMT
- X-Xxmessage-Id: <A78C336FCE019D21@sb3mac.inhs.uiuc.edu>
- Lines: 25
-
- In article <21JAN199314094029@ariel.lerc.nasa.gov> VMScluster
- Administrator, uugblum@ariel.lerc.nasa.gov writes:
-
- > Some of our Cray programs produce very large (~80,000 blocks) ASCII output
- > files which are sent to the VAXcluster for review. TPU and EDT don't work
- > well with large files. I think the editors read the entire file into memory
- > so the performance is poor if it works at all...
-
- Somebody should point out the general principle that a program producing very
- large quantities of printout could probably be written better, and that
- principle may apply to this case of very large quantities of text meant to be
- perused by humans. In this case, rather than repeating the supercomputer runs,
- it may be possible to write a postprocessor to select just the part of the 80k
- blocks that might be of interest to humans. Maybe even $ SEARCH would help....
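-
- For instance (just a sketch; the file names and search strings below are made
- up), something along the lines of
-
- $ SEARCH BIGRUN.LIS "ERROR","WARNING" /OUTPUT=SUMMARY.LIS /WINDOW=5
-
- would pull the matching lines, plus a few lines of context around each match,
- into a much smaller file that TPU or EDT can handle comfortably.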
- Ronald P. Larkin, Illinois Natural History Survey
- 607 E. Peabody Drive, Champaign, IL 61820 USA
- r-larkin@uiuc.edu (Ron Larkin)
-