CheckUrl 1.6.2 what's new
M modified
+ new
- deleted
1.6.2 + Added the /log <filename> and /report <filename> command line
options, which allow overriding the related settings (logfile
and htmllogfile) in checkurl.cfg.
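A hypothetical invocation showing the new options (the trailing URL argument form is an assumption for illustration, not taken from the docs; see the help screen for the real syntax):

```
checkurl /log mylog.txt /report mylog.html http://www.example.com/
```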
M Fixed a bug in the command line parser
M Removed the /c parameter for detach when using multiple connections:
some users reported it was broken on their system
M Corrected a bug which caused URLs on the same level not to be
parsed correctly
1.6 + Added the ability to analyze on-line URLs without downloading:
CheckUrl downloads the specified HTML page, analyzes it, and
checks the URLs it contains.
+ CheckUrl now finds not only absolute http URLs (http://...),
but also relative ones, such as "../index.html" or "/index.html".
This works only when parsing on-line URLs.
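How relative links of both kinds resolve against the page being parsed can be sketched with Python's standard urljoin (CheckUrl's own logic is in REXX and may differ; the base URL here is illustrative):

```python
from urllib.parse import urljoin

base = "http://example.com/docs/page.html"  # hypothetical page being parsed

# A root-relative link resolves against the site root:
print(urljoin(base, "/index.html"))    # http://example.com/index.html

# A parent-relative link resolves against the page's directory:
print(urljoin(base, "../index.html"))  # http://example.com/index.html

# Absolute links are left untouched:
print(urljoin(base, "http://other.example/x.html"))
```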
M Changed the command line parameters; launch CheckUrl without any
argument and you'll get the help screen
+ Added "forbidden" code recognition
M Some code restyling inside the source
M Changed my email address: new address is fc76@softhome.net
M Corrected href fields parsing
M Corrected URL redirection parsing when the redirection URL
doesn't include the base site (e.g. /newurl).
1.5 M Corrected a mistake about vars.!file.!log which caused the file
"vars.!file.!log" to be created
M Corrected HTML file parsing (Gopher links weren't skipped, local
files were parsed as external URLs, and trailing slashes were
stripped)
+ Configuration is contained now in the file "checkurl.cfg"
+ Added multi-connection support: CheckUrl can now check more than
one URL at a time by launching child processes and assigning them
the URLs to check. This can improve speed up to x times depending
on the number of processes you choose to use and on the speed of
your connection. Port 1932 is used for inter-process communication.
You can use multiple connections by specifying /mconn on the
command line (you must have the address 'localhost' in your "hosts"
file - see doc for details)
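The idea of handing URLs out to several concurrent workers can be sketched as follows. CheckUrl itself launches child processes and coordinates them over port 1932; this illustration uses a thread pool and a stubbed check function instead, and all names are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def check_url(url):
    # Stub: a real checker would connect and fetch the first bytes here.
    return (url, "ok")

def check_all(urls, workers=4):
    # Each worker takes the next unchecked URL, so total time is roughly
    # divided by the number of workers (network speed permitting).
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(check_url, urls))

results = check_all(["http://a.example/", "http://b.example/"])
print(results)
```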
+ Added html report "report.html"
+ Added bad URL logging to badurlfile (see cfg). This can be useful
for checking the same URLs several times to be sure they really are bad.
M Corrected URL translation for some characters
+ Added multi-pass mode (check a URL x times)
+ Better error reporting, both receiving data and connecting
+ Added configurable timeout when talking with remote servers
1.4 + Added the capability to check HTML files directly, such as a Netscape
bookmark file. This should be really useful!
During HTML file parsing, duplicate URL checking is performed too.
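Extracting href targets from an HTML file while skipping duplicates can be sketched with Python's stdlib parser (class and variable names here are illustrative, not CheckUrl's own):

```python
from html.parser import HTMLParser

class HrefCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.seen = set()   # dupe check: each URL is kept once
        self.urls = []      # URLs in the order they appear

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value not in self.seen:
                    self.seen.add(value)
                    self.urls.append(value)

html = '<a href="http://a.example/">x</a><a href="http://a.example/">dup</a>'
p = HrefCollector()
p.feed(html)
print(p.urls)  # the duplicate link appears only once
```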
+ Better error handling (some still to do)
+ Better output:
now there are three categories of messages
- ok - the url exists
- warning - the url is relocated
- error - url not found or other error
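One plausible way to map an HTTP status code onto the three message categories above (the exact mapping CheckUrl uses is not spelled out here, so this is an assumption):

```python
def categorize(status):
    # 2xx: the url exists; 3xx: the url is relocated; anything else
    # (404, 403, 5xx, ...) is treated as an error.
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "warning"
    return "error"

print(categorize(200), categorize(301), categorize(404))
```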
+ Log to file - with the possibility to log only warnings and errors
M Changed the method used to check URLs: CheckUrl now fetches the first
bytes of a URL as if it were a browser (for http URLs), because some www
servers don't support the HEAD request (incredible, but true!)
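The browser-like probe described above can be sketched like this: send an ordinary GET over a socket and read only the first bytes of the reply, enough to see the status line, instead of relying on HEAD. Host, byte limit, and timeout are illustrative, and `probe` is not called here so the sketch runs without a network:

```python
import socket

def build_get(host, path="/"):
    # A minimal HTTP/1.0 GET, as a browser of the era might send it.
    return ("GET %s HTTP/1.0\r\nHost: %s\r\n\r\n" % (path, host)).encode()

def probe(host, port=80, path="/", limit=256, timeout=10):
    # Fetch at most `limit` bytes; the status line is all we need to
    # classify the URL, and the connection is then dropped.
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(build_get(host, path))
        return s.recv(limit)

print(build_get("example.com"))
```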
M Commented out "call SockDropFuncs" and "call FtpDropFuncs", because such
DLLs could be used by other running programs.
M Better handling of "Moved temporarily" and "Moved Permanently"
return messages
1.0 M Corrected the bug in version 1.0b
1.0b1 Released the first public version of CheckUrl. It showed some bugs when
checking against some types of http servers.