- Path: sparky!uunet!olivea!sgigate!sgiblab!troi!steve
- From: steve@dbaccess.com (Steve Suttles)
- Newsgroups: comp.os.vms
- Subject: Re: Editing a large file
- Message-ID: <152@mccoy.dbaccess.com>
- Date: 22 Jan 93 21:13:15 GMT
- References: <1993Jan22.060745.7218@news.acns.nwu.edu>
- Organization: Cross Access Corp., Santa Clara, CA
- Lines: 20
- X-Newsreader: Tin 1.1 PL4
-
- jweiss@casbah.acns.nwu.edu (Jerry Weiss) writes:
- : In article <21JAN199314094029@ariel.lerc.nasa.gov> uugblum@ariel.lerc.nasa.gov (VMScluster Administrator) writes:
- : >Some of our Cray programs produce very large (~80,000 blocks) ASCII output
- : >files which are sent to the VAXcluster for review. TPU and EDT don't work
- : >well with large files. I think the editors read the entire file into memory
- : >so the performance is poor if it works at all. One work-around could be to
- : >break the large file into several smaller files. Does anyone know of a DECUS
- : >utility which can do this? Does anyone else have another suggestion?
- : >
- :
- : Real (old) programmers use TECO.
- :
- This remark is entirely true, and mostly flattering. I will therefore
- overlook the part pointing out the chronological facts of life. Thank you.
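- 
- As for the original file-splitting question: if no DECUS utility turns
- up, the job is simple enough to script yourself. Here is a minimal
- sketch in Python, offered purely as an illustration rather than a
- VMS-specific tool; the part size and the ".partNNN" output naming
- below are my own placeholder choices:
- 
-     # split_file.py -- minimal sketch: break a large ASCII file into
-     # pieces of a fixed number of lines. LINES_PER_PART and the
-     # ".partNNN" output naming are arbitrary placeholder choices.
-     import sys
- 
-     LINES_PER_PART = 100000   # lines per output piece; adjust to taste
- 
-     def split_file(path, lines_per_part=LINES_PER_PART):
-         out = None
-         part = 0
-         with open(path) as src:
-             for i, line in enumerate(src):
-                 if i % lines_per_part == 0:   # start a new output piece
-                     if out:
-                         out.close()
-                     part += 1
-                     out = open("%s.part%03d" % (path, part), "w")
-                 out.write(line)
-         if out:
-             out.close()
- 
-     if __name__ == "__main__":
-         split_file(sys.argv[1])
- 
- Reading and writing one record at a time keeps memory use flat no
- matter how large the input is, which is exactly the property TPU and
- EDT lack when handed an 80,000-block file.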
-
- --
- Steve Suttles Internet: steve@dbaccess.com Dr. DCL is IN!
- CROSS ACCESS Corporation UUCP: {uunet,sgiblab}!troi!steve Yo speako TECO!
- 2900 Gordon Ave, Suite 100 fax: (408) 735-0328 Talk data to me!
- Santa Clara, CA 95051-0718 vox: (408) 735-7545 HA! It's under 4 lines NOW!
-