- Short: Recursively get WWW documents to local disk
- Author: markie@dds.nl
- Uploader: markie@dds.nl
- Type: comm/tcp
-
- AbGetUrl can be used to download WWW documents to local disk for
- offline browsing or archival purposes. This version uses
- multitasking to fetch several documents (default 8) at once. When
- the downloading is done, the links in the documents can be
- converted to be relative to the local disk.
-
- USAGE: ABGETURL URL TO/K,DEPTH/N,LINKS/N,LOCAL/S,NOFIX/S,HEADER/S,
- GET/S,NOINDEX/S,RELOAD/S,RELOADHTML/S
-
- TO <directory>    Destination directory
- DEPTH <value>     Recursion depth
- LINKS <value>     Number of links to open simultaneously
- LOCAL             Stay on the local site
- NOFIX             Do not fix links in downloaded documents
- HEADER            Save HTTP headers to disk
- GET               Just get one URL to a file (Why is this?)
- NOINDEX           Do not create an index file
- RELOAD            Reload all documents
- RELOADHTML        Reload all HTML documents
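-
- A hypothetical invocation (the destination path and URL here are
- only examples, not from the archive) that mirrors a site two
- levels deep, staying on the same site and fetching four documents
- at a time:
-
-     AbGetUrl http://www.example.com/ TO Work:Mirror DEPTH 2 LINKS 4 LOCAL
-
- The template follows standard AmigaDOS ReadArgs conventions:
- /K options take a keyword argument, /N a numeric argument, and
- /S options are plain switches.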
-
- This program is giftware; if you like it and use it you should
- send the author a gift. ( ZIP drive / JAZZ drive / 060 / etc. ;-)
-
- Disclaimer: use this program at your own risk.
-
-
- ============================= Archive contents =============================
-
- Original Packed Ratio Date Time Name
- -------- ------- ----- --------- -------- -------------
- 31728 17026 46.3% 29-Dec-96 01:18:54 AbGetUrl
- 1418 1004 29.1% 18-Dec-96 13:04:24 AbGetUrl.info
- 1188 608 48.8% 29-Jan-97 19:56:22 AbGetUrl.readme
- -------- ------- ----- --------- --------
- 34334 18638 45.7% 30-Jan-97 18:26:54 3 files
-