Newsgroups: comp.os.vms
Path: sparky!uunet!cs.utexas.edu!zaphod.mps.ohio-state.edu!news.acns.nwu.edu!casbah.acns.nwu.edu!jweiss
From: jweiss@casbah.acns.nwu.edu (Jerry Weiss)
Subject: Re: Editing a large file
Message-ID: <1993Jan22.060745.7218@news.acns.nwu.edu>
Sender: usenet@news.acns.nwu.edu (Usenet on news.acns)
Nntp-Posting-Host: unseen1.acns.nwu.edu
Organization: Northwestern University, Evanston Illinois.
References: <21JAN199314094029@ariel.lerc.nasa.gov>
Date: Fri, 22 Jan 1993 06:07:45 GMT
Lines: 15

In article <21JAN199314094029@ariel.lerc.nasa.gov> uugblum@ariel.lerc.nasa.gov (VMScluster Administrator) writes:
>Some of our Cray programs produce very large (~80,000 blocks) ASCII output
>files which are sent to the VAXcluster for review. TPU and EDT don't work
>well with large files. I think the editors read the entire file into memory
>so the performance is poor if it works at all. One work-around could be to
>break the large file into several smaller files. Does anyone know of a DECUS
>utility which can do this? Does anyone else have another suggestion?
>

Real (old) programmers use TECO.

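For anyone who would rather not learn TECO, a few dozen lines of C will do the
split the original poster mentions.  This is only a rough sketch, not a DECUS
utility: the output names (splitNNN.txt) and the 50,000-line chunk size are
made up for illustration, so adjust them to taste.

/* split.c -- rough sketch: break a big ASCII file into fixed-size pieces.
 * Output naming and LINES_PER_PIECE are illustrative only.
 */
#include <stdio.h>
#include <stdlib.h>

#define LINES_PER_PIECE 50000L   /* lines per output piece; adjust as needed */

int main(int argc, char *argv[])
{
    char line[4096];             /* longest expected input record */
    char outname[64];
    FILE *in, *out = NULL;
    long nlines = 0;
    int piece = 0;

    if (argc < 2) {
        fprintf(stderr, "usage: split <input-file>\n");
        return EXIT_FAILURE;
    }
    if ((in = fopen(argv[1], "r")) == NULL) {
        perror(argv[1]);
        return EXIT_FAILURE;
    }

    while (fgets(line, sizeof line, in) != NULL) {
        /* start a new output piece every LINES_PER_PIECE lines */
        if (nlines % LINES_PER_PIECE == 0) {
            if (out != NULL)
                fclose(out);
            sprintf(outname, "split%03d.txt", ++piece);
            if ((out = fopen(outname, "w")) == NULL) {
                perror(outname);
                return EXIT_FAILURE;
            }
        }
        fputs(line, out);
        nlines++;
    }

    if (out != NULL)
        fclose(out);
    fclose(in);
    return EXIT_SUCCESS;
}

Compile it with whatever C compiler is on the cluster and pass it the big
file as the argument; each piece then opens comfortably in TPU or EDT.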

--
Jerry S. Weiss "If you can't stand the heat, stay out of the antimatter!"
j-weiss@nwu.edu Dept. Medicine, Northwestern Univ. Medical School