- Newsgroups: comp.unix.bsd
- Path: sparky!uunet!usc!sol.ctr.columbia.edu!ira.uka.de!News.BelWue.DE!news.uni-tuebingen.de!mailserv!zxmsd01
- From: zxmsd01@mailserv.zdv.uni-tuebingen.de (Gunther Schadow)
- Subject: Re: tar 'multivolume' bugs and workaround workaround
- Message-ID: <zxmsd01.715349289@mailserv>
- Sender: news@softserv.zdv.uni-tuebingen.de (News Operator)
- Organization: Comp. Center (ZDV) U of Tuebingen, FRG
- References: <1992Aug26.041547.26517@ntuix.ntu.ac.sg> <Btn5u7.qD@acwbust.sub.org>
- Date: Tue, 1 Sep 1992 12:08:09 GMT
- Lines: 58
-
- I think it is most desirable to have a tar with a variant of the -z
- option (say -Z) which compresses each file *before* dumping it into
- the archive. This method would be superior to just compressing the
- output stream of tar, which is what the -z option actually does. It
- is, however, harder to implement.
- The -z option can easily be simulated by a pipe: "tar -cf - whatever
- | compress > medium". -z and -M won't work together anyway in the
- current GNU tar. But the -Z option I am thinking of is not realizable
- with a simple shell script. You cannot just say something like
- "tar -cf medium (compress whatever)".
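- For completeness, the pipe simulation looks roughly like this (a
- minimal sketch; compress(1) and zcat(1) are assumed to be on the
- path, and the archive name is only an example):
-
-     # write one big compressed archive of a directory tree
-     tar -cf - whatever | compress > backup.tar.Z
-
-     # read it back: the *whole* stream must be uncompressed again
-     zcat backup.tar.Z | tar -xf -
-
- The weakness is exactly that single compressed stream, as argued
- below.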
- Compressing every file on your disk before archiving it is no
- solution either, because (1) it takes time, (2) you have to
- uncompress everything again afterwards, and (3) the access times
- will be modified, so you lose information.
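- Just to spell out what I am ruling out here, such an in-place run
- would look roughly like this (a sketch only; the paths and the
- device name are examples, not a recommendation):
-
-     # compress everything first, then archive it
-     find /usr/src -type f -exec compress -f {} \;
-     tar -cMf /dev/rfd0a /usr/src
-
-     # and afterwards you have to undo it all again
-     find /usr/src -name '*.Z' -exec uncompress {} \;
-
- Every file gets rewritten twice, which is objections (1) and (2)
- right there.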
- I think it is *urgently* necessary to add such a -Z feature to GNU
- tar. The reason is at hand: say you have a backup spanning 30 disks.
- If it is one huge compressed tar archive and you lose one disk for
- whatever reason, you can forget about getting a single correct file
- back from the disks beyond the damaged one, because uncompress gets
- completely out of sync. Why make any backup at all if you cannot
- rely on it in an emergency? It is far more likely that one out of 30
- disks gets damaged -- what about number 3 :-( -- than that your 120
- MByte hard disk will crash over the years.
- GNU tar is able to skip over damaged records and can even start
- untarring from any point within the tar file, but only tar can do
- that, not uncompress. This feature of GNU tar is also useful if you
- don't want to spend half an hour playing disk jockey while
- uncompress|tar skips over 25 disks until it extracts, in a few
- seconds, the 20 kByte file you want. Or, even worse, you made a typo
- in the file name you gave to tar, or you forgot the actual path to
- it: again 30 minutes of your life...
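- (The skipping I mean is, as far as I remember, GNU tar's -i /
- --ignore-zeros and -K / --starting-file options; check the manual of
- your version for the exact names. A sketch, with the file and device
- names only as examples:
-
-     # pull a single file out of a multi-volume floppy set,
-     # skipping over zeroed / unreadable records on the way
-     tar -x -M -i -K lib/libc/gen/getcwd.c -f /dev/rfd0a
-
- None of this helps once the stream is one compressed whole.)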
- Is there anybody who has backed up his 240 MByte disk onto 240
- floppy disks with the normal dump? Or made an extremely unreliable
- .tar.Z backup onto approximately 180 disks? I didn't --- and I cross
- my fingers :-). The only backup I have is the network; not that I
- have the facilities to dump to an NFS server, but I publish almost
- every good change I make to 386BSD files onto agate.berkeley.edu.
- It would be nice to have a tape drive, but can you afford to pay $50
- for a small 40 MByte tape cartridge if you need 6 of them for your
- level-0 dump? BTW: has anybody tried to use a video tape recorder to
- dump his hard disk onto a cheap $2.50 videotape? That, together with
- a tar -Z facility, would be the most convenient, cheap, secure, and
- effective way of archiving backups or etc01 distributions.
- These are just a few of my thoughts about backups, archives, and
- tar. I once tried to get into the tar sources to add my -Z option,
- but I failed due to the complexity of tar's internals, and mainly
- due to my lack of time. If there is anybody familiar with the tar
- code, please think about my -Z idea and help GNU tar become the best
- common archiving program!
-
- regards
- -Gunther
- --
- -------------------------------------------------------------------------------
- Gunther Schadow, e-mail: Gunther@mailserv.ZDV.Uni-Tuebingen.DE
- Sudetenstrasse 25, Phone: (49) 7071/37527
- 7400 Tuebingen, Germany.__________Stop__________Horn Please!__________O.K. TATA
-