- Newsgroups: comp.unix.admin
- Path: sparky!uunet!zaphod.mps.ohio-state.edu!usc!cs.utexas.edu!asuvax!elam.mdl.sandia.gov!cs.sandia.gov!wllarso
- From: wllarso@sandia.gov (William L Larson)
- Subject: Re: How do you back up a terabyte?
- Message-ID: <1992Dec16.162326.24324@cs.sandia.gov>
- Sender: usenet@cs.sandia.gov (Another name for news)
- Organization: Sandia National Laboratories
- References: <ericw.724459057@hobbes>
- Date: Wed, 16 Dec 92 16:23:26 GMT
- Lines: 18
-
- In article <ericw.724459057@hobbes> ericw@hobbes.amd.com (Eric Wedaa) writes:
- >And does anyone remember the real reason why dump/tar/cpio is not
- >a good thing on an active filesystem?
-
- There was a presentation at USENIX LISA V (1991) on exactly this issue;
- I have a copy in the proceedings. Look for "Issues in On-line Backup" by
- Steve Shumway of SunSoft, Inc. The problem with on-line backups is that
- the filesystem structure may be modified while the backup is running (to
- paraphrase Steve's article, I hope accurately). If the structure changes
- while the backup is in progress, the backup cannot preserve it. For
- example, dump uses a two-pass procedure: first it finds all of the files
- and directories, and then it copies them. What happens if a directory is
- added, deleted, or moved after pass one but before pass two? Steve did
- an excellent analysis of the possibilities. The same problems arise with
- tar and cpio.
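-
- To make the window concrete, here is a minimal two-pass sketch (my own
- illustration in C, not dump's actual code; the path buffer size and the
- fixed-size file list are assumptions chosen for brevity). Pass one walks
- the tree and records pathnames; pass two copies whatever was recorded.
- Any file removed or renamed between the passes becomes an error in pass
- two, and anything created after pass one is silently missed:
-
- /* Two-pass backup sketch: enumerate, then copy.  An active filesystem
-  * can change between the two passes, which is the whole problem. */
- #include <stdio.h>
- #include <stdlib.h>
- #include <string.h>
- #include <errno.h>
- #include <dirent.h>
- #include <sys/stat.h>
-
- #define MAXFILES 10000
-
- static char *filelist[MAXFILES];
- static int nfiles;
-
- /* Pass 1: record every regular file under "dir". */
- static void enumerate(const char *dir)
- {
-     DIR *dp = opendir(dir);
-     struct dirent *de;
-     char path[1024];
-     struct stat st;
-
-     if (dp == NULL)
-         return;
-     while ((de = readdir(dp)) != NULL) {
-         if (strcmp(de->d_name, ".") == 0 || strcmp(de->d_name, "..") == 0)
-             continue;
-         snprintf(path, sizeof path, "%s/%s", dir, de->d_name);
-         if (lstat(path, &st) < 0)
-             continue;
-         if (S_ISDIR(st.st_mode))
-             enumerate(path);
-         else if (S_ISREG(st.st_mode) && nfiles < MAXFILES)
-             filelist[nfiles++] = strdup(path);
-     }
-     closedir(dp);
- }
-
- /* Pass 2: copy each recorded file to the "archive" (stdout here).
-  * Files that vanished since pass 1 show up as errors; files created
-  * since pass 1 are never seen at all. */
- static void copy_pass(void)
- {
-     char buf[8192];
-     size_t n;
-     int i;
-
-     for (i = 0; i < nfiles; i++) {
-         FILE *fp = fopen(filelist[i], "r");
-         if (fp == NULL) {
-             fprintf(stderr, "vanished between passes: %s (%s)\n",
-                     filelist[i], strerror(errno));
-             continue;
-         }
-         while ((n = fread(buf, 1, sizeof buf, fp)) > 0)
-             fwrite(buf, 1, n, stdout);
-         fclose(fp);
-     }
- }
-
- int main(int argc, char **argv)
- {
-     enumerate(argc > 1 ? argv[1] : ".");
-     /* ...on an active filesystem, anything can happen right here... */
-     copy_pass();
-     return 0;
- }
-
- The directory cases mentioned above (added, deleted, or moved between
- the passes) are worse still, since the recorded pathnames may no longer
- describe the tree at all by the time the copy pass gets to them.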
-
- Steve, if you are out there, are copies of your analysis available
- on-line?
-