# Look for duplicate files anywhere in a directory tree.

# Copyright (c) 1990 by Hamilton Laboratories. All rights reserved.

# It works by first constructing the list of all the path names of
# everything in the tree using the -r (recursive) option of ls. The `...`
# part is command substitution to paste that result into the foreach
# loop. The :gt operator means globally edit the list to trim each
# pathname down to just the tail part; e.g., given "x\y\z.c", the tail
# is just "z.c". (There are other pathname editing operators for
# grabbing just the directory containing, everything except the
# extension, the fully-qualified name for a relative pathname, etc.)
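#
# A sketch of that editing syntax (the :t, :h, :r, and :e spellings are
# assumed from ordinary csh convention; only :gt actually appears in
# this script):
#
#   set p = x\y\z.c
#   echo $p:t   # tail:                        z.c
#   echo $p:h   # head (containing directory): x\y
#   echo $p:r   # root (minus the extension):  x\y\z
#   echo $p:e   # extension alone:             c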
#
# The foreach loop writes each name out to the pipe, one per line. (I've
# used a calc statement to do the writing, but you could also use an
# "echo $i".) The sort obviously sorts all the lines alphabetically, and
# the uniq -d command gives just the duplicates.
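#
# Worked through with made-up names: if the tree holds both a\z.c and
# b\z.c, the loop writes "z.c" twice; sort makes those two lines
# adjacent, and uniq -d prints "z.c" once, flagging the duplicate.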
#
# The whole operation takes about 2 minutes to search an entire typical
# (very full) 100 MB HPFS partition.
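#
# Typical invocation (the script name and starting directory here are
# just examples):
#   duplicat.csh d:\src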

proc duplicat(startdir)
   local i
   foreach i (`ls -r $startdir`:gt) calc i; end | sort | uniq -d
end

duplicat $argv