OS/2 Shareware BBS: 35 Internet / 35-Internet.zip / url_util.zip / downurl.cmd
OS/2 REXX Batch file | 1997-07-19 | 1KB | 24 lines
/*
* DOWNURL.CMD by pardoz@io.com, July 16, 1997
*/
/* Edit the value of "myfile" to point to the file where you want the
 * downloaded URLs to be written. This script puts each URL on a
 * separate line, producing a file that can be fed into sslurp or
 * any other Web page grabbing tool that can be invoked from the
 * command line and accepts a file of URLs to fetch as a parameter.
* I call sslurp from the .CMD file I use to download mail and
* fetch new news with slrnpull.
* sslurp is a freeware Web grabber that can be found at
 * ftp://hobbes.nmsu.edu/pub/os2/apps/internet/www/util/sslurpN.zip,
 * where N is the latest version number (12 at the time of this writing).
 * Installation: put downurl.cmd in your slrn directory and add the following
* line to your slrn.rc:
* set non_Xbrowser "echo %s | downurl.cmd > nul"
* or call downurl using the picker S-Lang macro in this zipfile.
*/
/* Read the URL passed on stdin and append it to the URL list file. */
parse pull url
myfile = 'e:\geturls'
call LINEOUT myfile, url        /* write the URL on its own line */
call LINEOUT myfile             /* close the stream */
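
For context, here is a minimal sketch of the kind of wrapper .CMD the author describes calling sslurp from after a mail/news session. The file name fetchurls.cmd and the "sslurp <file>" command-line syntax are assumptions, not taken from the source; check the documentation shipped in the sslurp zip for the real invocation.

/* fetchurls.cmd -- hypothetical wrapper, run after a mail/news session.
 * Feeds the URLs collected by downurl.cmd to sslurp, then clears the
 * list so the next session starts empty.
 */
myfile = 'e:\geturls'
if STREAM(myfile, 'C', 'QUERY EXISTS') \= '' then do
   'sslurp' myfile       /* assumed syntax: fetch every URL in the file */
   'del' myfile          /* empty the list for the next session */
end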