Path: sparky!uunet!usc!news.service.uci.edu!unogate!mvb.saic.com!carpenter
From: Carpenter@Fwva.Saic.Com (Apprentice Wizard)
Newsgroups: comp.os.vms
Subject: Re: Editing a large file
Message-ID: <10242009@MVB.SAIC.COM>
Date: Thu, 21 Jan 1993 19:46:14 GMT
References: <21JAN199314094029@ariel.lerc.nasa.gov>
Organization: Science Applications Int'l, Corp. - Computer systems operation
Lines: 35

uugblum@ariel.lerc.nasa.gov (VMScluster Administrator) writes:
>Some of our Cray programs produce very large (~80,000 blocks) ASCII output
>files which are sent to the VAXcluster for review. TPU and EDT don't work
>well with large files. I think the editors read the entire file into memory
>so the performance is poor if it works at all. One work-around could be to
>break the large file into several smaller files. Does anyone know of a DECUS
>utility which can do this? Does anyone else have another suggestion?
>

A quick workaround might be (1000 lines per output file; the original
posting always wrote to FILE1 and never handled end-of-file, so I've
used fixed channel names and a /END_OF_FILE branch):

$ x = 1
$ open/read infile file.dat
$ newfile:
$ open/write outfile file'x'.dat
$ count = 1
$ loop:
$ read/end_of_file=done infile line
$ write outfile line
$ count = count + 1
$ if count .eq. 1001
$ then
$     close outfile
$     x = x + 1
$     goto newfile
$ endif
$ goto loop
$ done:
$ close infile
$ close outfile

*** I just wrote this off the fly without testing it ***
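
For anyone who wants the same split outside DCL, here is a minimal Python
sketch of the idea (not part of the original post; the file.dat and
file1.dat, file2.dat, ... names simply mirror the DCL example above):

```python
def split_file(path="file.dat", lines_per_file=1000):
    """Split `path` into file1.dat, file2.dat, ... with at most
    `lines_per_file` lines each. Returns the number of chunks written."""
    part = 0
    out = None
    with open(path) as infile:
        for i, line in enumerate(infile):
            # Start a new chunk every `lines_per_file` lines.
            if i % lines_per_file == 0:
                if out:
                    out.close()
                part += 1
                out = open("file%d.dat" % part, "w")
            out.write(line)
    if out:
        out.close()
    return part
```

Like the DCL version, this streams the file line by line, so memory use
stays flat no matter how large the input is.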
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Scott Carpenter
VAX Systems Manager              Ya dina' tell 'im how long it'd really
SAIC Falls Church, VA            take ta fix it did ya'?
CARPENTER@FWVA.SAIC.COM          M. Scott, CAPT, SUFP
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=