Features:
- Multithreaded web spider.
- The program resumes broken downloads over HTTP and FTP-via-HTTP
connections (direct FTP resume will be added in a subsequent version).
See the resume sketch after this list.
- HTTP and FTP-via-HTTP proxy servers are supported.
- You can import favorites from Internet Explorer.
- The program has a built-in ping utility to detect whether an
Internet connection is available. Downloads stop if the connection is
lost and resume automatically once it becomes available again (see the
connectivity sketch after this list).
- You can define connection options such as the maximum number of
concurrent connections, the maximum number of concurrent connections
per server, and the pause between consecutive connections to the same
server. This keeps individual servers from being overloaded while
maintaining high download speeds when you download from multiple
servers (see the throttling sketch after this list).
- Global filters
are available in addition to task filters. For instance,
you can define a global filter to exclude banners. Global filters are applied
to every task in the project file.
- You can search downloaded files by keywords, phrases, complex
Boolean expressions, URLs, webpage titles, file size ranges, file
types, file modification dates, and download dates.
- The program uses a standard MS Access database as its project file.
Because this is a very widely used database format, you have direct
access to the project data, and you can also query the database
programmatically (see the database sketch after this list).
- You can mirror websites to disk, preserving their directory
structure.
- The program extensively supports drag-and-drop. You can drag and drop
links, files, and HTML selections to the Tasks Window or to the Drop Box.
You can also drag and drop tasks and resource files to the file
system or between two projects.
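
A resume over HTTP (see the resume item above) is ordinarily done with
a Range request. The sketch below is a minimal Python illustration
using the requests library; it is not the program's own code, and the
URL and file names in it are purely illustrative:

    import os
    import requests

    def resume_download(url, path):
        # Ask the server to send only the bytes we do not have yet.
        pos = os.path.getsize(path) if os.path.exists(path) else 0
        headers = {"Range": "bytes=%d-" % pos} if pos else {}
        with requests.get(url, headers=headers, stream=True,
                          timeout=30) as r:
            if r.status_code == 206:   # server honoured the Range header
                mode = "ab"            # append to the partial file
            else:
                r.raise_for_status()
                mode = "wb"            # server resent the whole file
            with open(path, mode) as f:
                for chunk in r.iter_content(64 * 1024):
                    f.write(chunk)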
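
The connectivity detection above uses a ping utility. A true ICMP ping
needs raw sockets and elevated privileges, so this sketch substitutes
a plain TCP connect test; the host and port are illustrative defaults,
not what the program actually probes:

    import socket

    def internet_available(host="8.8.8.8", port=53, timeout=3):
        # Try to open a TCP connection to a well-known public server;
        # success is taken as "the Internet connection is available".
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False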
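
The connection options above amount to a polite download scheduler:
a global connection cap, a per-server cap, and a pause between
requests to the same server. A minimal sketch of that idea, with
illustrative default limits rather than the program's own:

    import threading
    import time
    from urllib.parse import urlparse

    class PoliteScheduler:
        def __init__(self, max_total=8, max_per_server=2,
                     server_pause=1.0):
            self.total = threading.BoundedSemaphore(max_total)
            self.per_server = {}   # host -> per-server semaphore
            self.next_slot = {}    # host -> earliest allowed request time
            self.max_per_server = max_per_server
            self.server_pause = server_pause
            self.lock = threading.Lock()

        def acquire(self, url):
            host = urlparse(url).netloc
            with self.lock:
                sem = self.per_server.setdefault(
                    host, threading.BoundedSemaphore(self.max_per_server))
            self.total.acquire()   # global connection cap
            sem.acquire()          # per-server connection cap
            with self.lock:
                # Reserve the next time slot for this server.
                slot = max(self.next_slot.get(host, 0.0), time.time())
                self.next_slot[host] = slot + self.server_pause
            time.sleep(max(0.0, slot - time.time()))
            return host

        def release(self, host):
            self.per_server[host].release()
            self.total.release()

Each worker thread would call acquire(url) before connecting and
release(host) when the transfer finishes.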
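
Because the project file is an MS Access database, any ODBC-capable
tool can read it. The sketch below uses Python's pyodbc; the .mdb path
and the table and column names are assumptions for illustration, so
inspect the actual project file for its real schema:

    import pyodbc

    conn = pyodbc.connect(
        r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
        r"DBQ=C:\Projects\mysite.mdb")
    for row in conn.execute("SELECT URL, LocalFileName FROM Resources"):
        print(row.URL, "->", row.LocalFileName)
    conn.close()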
How this program differs from other offline browsers
- The most distinctive feature of this program is that it can
download virtually everything that you can view with a standard browser,
be it Flash, JavaScript links (no matter how complex the scripts are),
content available through web forms, HTTPS websites, or FTP sites.
If the program fails to download a task or some pages of a task, open
the task with the built-in browser and browse to the failed webpage.
The page will be retrieved with the help of the browser and the
download will continue. A subsequent version of the program will offer
an option to do this automatically.
- Only this program provides security against viruses and malicious
scripts that can arrive along with downloaded webpages. When you
browse the offline content with the built-in browser, you have the
same security as when you browse the Internet with a standard browser.
Other offline utilities store downloaded websites on your local hard
drive as linked files or run a webserver on your PC; when you browse
your local hard drive, the browser relaxes its security because it
assumes you are viewing trusted content. This program instead notifies
the browser that the pages you browse must be treated the same way as
pages from the Internet (see the first sketch at the end of this
list).
- This program gives you a true ability to download only the pages you
need and nothing else, and you do not have to use filters for this
purpose (though the program provides very powerful automated filtering
tools). Instead of setting levels and filters on a task, just open a
page with the built-in browser and drag and drop the links you want to
retrieve to a special topmost window called the Drop Box. The program
will download those links. You can also drag and drop webpage text
selections; every link in the selected text fragment will be
downloaded.
- You can browse tasks with the built-in browser while the program downloads
them. Retrieved pages will be loaded immediately. If you click on a link
that is not yet retrieved, it will be scheduled
for immediate download.
- You can search the downloaded websites and pages for a particular
word, phrase, or combination of them. The program gives you very
powerful search capabilities; it is like having a search engine
dedicated to your offline content alone.
- No matter how, when, and in what order you download webpages, and
regardless of which tasks they belong to, if two pages link to each
other you can click the link and reach the linked page even though it
may have been retrieved by another task. This lets you download
different parts of a site with different tasks and different
parameters and still have the parts linked to each other as if a
single task had downloaded them, or place shortcuts in the Tasks List
to any page of a website (see the second sketch below).
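
The security item above describes telling the browser to treat local
pages as Internet content. In Internet Explorer this is done with the
"Mark of the Web" comment; whether this exact mechanism is what the
program uses is an assumption here. A minimal sketch of stamping a
saved page:

    def add_mark_of_the_web(html, source_url="about:internet"):
        # IE's Mark of the Web: an HTML comment near the top of the
        # file, naming a source URL with its character count in
        # parentheses, makes the browser apply Internet-zone security
        # to a page opened from the local disk.
        motw = ("<!-- saved from url=(%04d)%s -->\r\n"
                % (len(source_url), source_url))
        return motw + html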
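
The cross-task linking in the last item implies a single URL-to-file
index shared by every task in a project. A minimal sketch of that
idea; all names here are illustrative, not the program's internals:

    class ProjectIndex:
        # One shared index for the whole project: any task that saves
        # a page registers it here, and any link resolves against it.
        def __init__(self):
            self.by_url = {}   # canonical URL -> (local path, task id)

        def register(self, url, local_path, task_id):
            self.by_url[url.rstrip("/")] = (local_path, task_id)

        def resolve(self, url):
            # Returns the local copy no matter which task downloaded
            # it, or None if no task has retrieved the page yet.
            hit = self.by_url.get(url.rstrip("/"))
            return hit[0] if hit else None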