After two years of radio silence, HREF Tools Corp. is back with important news for anyone running web sites on Microsoft IIS. We are introducing a new product, StreamCatcher, which offers significant improvements in the stability, responsiveness, and findability of your web sites. It also eliminates the unfair load placed on your sites by web robots and Code Red attackers.
Here's a spontaneous comment from one of the beta-testers:
"I have added about a dozen domains to StreamCatcher without any problems at all. I really must say that StreamCatcher is one GREAT application."

What is StreamCatcher?

What can StreamCatcher do?
StreamCatcher provides the following optional functionality for both HTTP and HTTPS incoming requests on unlimited, independently configurable WebSites per server.
* It filters and remaps incoming URLs using fully configurable rules with optional wildcards.
* It allows conditional responses to specific URLs.
* It provides analysis and daily reporting on User Agents so that you can easily determine which web robots are frequently hitting your web site.
* It detects web robots, and offers specific URL remapping for these robots, along with daily reporting of their activity.
If you wish, you can restrict all web robots to one and only one page within your domain, giving them just the information you want indexed in the search engines and no more. Alternatively, you can control web robot access to each directory, file, or web application on your domain.
* It offers complete HTTP request analysis for web application developers and general education.
* It provides daily monitoring of bytes transferred in and out via HTTP for each defined WebSite.
* It can "play dead" by mapping certain incoming requests to "no response."
This feature shows great promise in minimizing system intrusions and preventing web server error pages from being returned in response to Code Red worm attacks. Whenever StreamCatcher "plays dead," the request never reaches the web server's main processor and is omitted from the web server software's log file. StreamCatcher can optionally log all such requests with only essential information.
The "play dead" capability can be used for any incoming request pattern, not just the Code Red worm.
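The filtering, remapping, and "play dead" behavior described above can be pictured with a small illustrative sketch. This is not StreamCatcher's code: the rule patterns, action names, and matching logic are assumptions for illustration only, since the real rules live in StreamCatcher's text-based configuration files.

```python
import fnmatch

# Hypothetical rule table (patterns and actions are illustrative only).
# Each rule maps a wildcard pattern to either "play_dead" or a target URL.
RULES = [
    ("/scripts/root.exe*", "play_dead"),   # a typical Code Red probe path
    ("*/default.ida*", "play_dead"),
    ("/old-page.html", "/new-page.html"),  # a simple one-to-one remap
]

def apply_rules(url):
    """Return 'play_dead', a remapped URL, or the URL unchanged."""
    for pattern, action in RULES:
        if fnmatch.fnmatch(url, pattern):
            return action
    return url
```

A request matching a "play_dead" rule would simply receive no response at all, so the worm's probe never reaches the web server's main processor.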
StreamCatcher has the following key features:
* A remote administration interface (via any browser) for configuration, testing, and reporting;
* Independent and secure remote consoles for each individual WebSite administrator;
* A highly flexible configuration scheme with text-based configuration files in arbitrary directory locations and no Windows registry entries;
* Live Refresh: once loaded, all configuration settings can be altered and reloaded remotely without restarting the ISAPI web server software;
* High Security: all user names and passwords for remote control, plus machine-specific paths, are kept in a separate security configuration file that is not accessible through the remote console;
* Real-time trace logging of incoming URL requests arriving at all WebSites or an individual WebSite (remotely accessible);
* A testing facility for determining how one (or hundreds) of incoming URLs will be remapped, with the logic detailed and discrepancies from expected results highlighted;
* Extensive use of color in test mode and trace log reports; colors are fully configurable;
* Special handling of URL remapping for WebHub users;
* Extensive status and error reporting;
* All functionality provided by a single DLL;
* Compatibility with Microsoft FrontPage and Microsoft ASP.
StreamCatcher is compatible with all versions of WebHub.
When used with WebHub, StreamCatcher offers a number of additional benefits:
Reduced bandwidth and reclaimed CPU cycles formerly wasted on Web robots
StreamCatcher can control web robot requests so that only a single session number is used, instead of a new session being spawned per request. As a result, the database built by the web robot contains cleaner, more meaningful links. Ordinarily, a web robot can request a new dynamic page thousands of times, receiving a new session number each time, and then follow that link to multiple pages within the dynamic site, producing a limitless number of unique yet largely redundant page requests. With StreamCatcher, the robot is identified and forced to use a designated session number, so the number of distinct links within the web site remains finite.
Reducing the number of entries in a web robot's database means that web robot traffic is reduced dramatically. This means less bandwidth is wasted on essentially redundant traffic, and less CPU time is wasted calculating results for non-human visitors.
Remember, web robots regularly revisit your site, testing every link already in their database to see whether the URLs are still valid. This is how StreamCatcher prevents the exponential growth of unnecessary web robot traffic.
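The arithmetic behind this can be sketched in a few lines. The /AppID:PageID:Session URL shape and the specific session numbers below are assumptions based on the link style shown later in this article, not StreamCatcher internals; the point is simply that a pinned robot session keeps the crawler's link database finite while per-visit sessions multiply it.

```python
import itertools

PAGES = ["home", "search", "detail"]       # a tiny three-page dynamic site
session_counter = itertools.count(2000)    # hypothetical session allocator

def link_for(page, is_robot):
    # A human visit may mint a fresh session number; a detected robot is
    # always forced onto the designated robot session (1001 here).
    session = 1001 if is_robot else next(session_counter)
    return f"/demoapp:{page}:{session}"

# Three crawls by a robot yield the same three URLs every time ...
robot_urls = {link_for(p, True) for _ in range(3) for p in PAGES}
# ... while three full visits with fresh sessions yield nine distinct URLs,
# and the count keeps growing with every revisit.
human_urls = {link_for(p, False) for _ in range(3) for p in PAGES}
```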
Increased Web site response times
By avoiding whatever overhead (within the Hub and within the application) is associated with creating a new session, StreamCatcher gives the web site a performance boost. Web robots tend to visit a site intensely when they do visit, requesting pages in parallel from multiple clients until their database is exhausted, so avoiding new-session overhead during that phase of activity is very significant.
Simplified search engine links (if web robot detection is enabled)
If web robot detection is enabled and the web robot session number is declared to be, for example, 1001, then links take this form: /AppID:PageID:1001
Keeping Session Number Blank for Web Robots (requires WebHub v2.009+)
In an ideal world, web robots would see your dynamic WebHub site as if it were static. This is accomplished through cooperation between StreamCatcher and WebHub v2.009+.
StreamCatcher detects a web robot request and remaps it using the designated robot session number.
Example: www.streamcatcher.com/AppID:PageID becomes www.streamcatcher.com/AppID:PageID:1001
The WebHub application then detects the robot session number 1001. When JUMP, GO, HIDE, etc. macros are used, WebHub generates links using short /AppID:PageID syntax (and no session number).
Example: (~JUMP|PageID2~) generates <a href="/AppID:PageID2">PageID2</a>

When the web robot comes back to follow these links, StreamCatcher again remaps them using session number 1001. With this technique, all web robots share the same session.
The normal session number functionality in all WebHub macros is preserved for humans.
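The round trip described above can be sketched as follows. The user-agent substrings, the robot-detection test, and the colon-counting shortcut are illustrative assumptions; StreamCatcher's real robot detection and WebHub's real URL parsing are considerably richer.

```python
# Hedged sketch: remap a sessionless robot request onto the designated
# robot session, while leaving human requests untouched.
ROBOT_SESSION = "1001"
ROBOT_AGENT_HINTS = ("googlebot", "slurp", "crawler")  # illustrative only

def remap(url, user_agent):
    ua = user_agent.lower()
    is_robot = any(hint in ua for hint in ROBOT_AGENT_HINTS)
    # /AppID:PageID has one colon; /AppID:PageID:Session has two.
    if is_robot and url.count(":") == 1:
        return f"{url}:{ROBOT_SESSION}"   # /AppID:PageID -> /AppID:PageID:1001
    return url
```

A robot following a short /AppID:PageID link is thus folded back into session 1001 on every visit, while human visitors keep their own session numbers.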
To implement this in your WebHub application, you only need to add two application-level defaults to your INI file (explained in detail in the StreamCatcher User Guide).
Note: WebHub v2.009 shipped on 25-January-2002.
Ability to Put an Application "On Hold" in Real Time
For those times when a developer needs to make changes to an application running on a production server, wants to display a temporary notice to surfers, and still needs to be able to test the application, StreamCatcher has an answer. Each AppID may be independently marked as unavailable to all users, or to everyone except someone using a given session number.
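The "on hold" gate amounts to a simple per-AppID check, which can be sketched like this. The configuration keys, the exempt session number, and the notice page are all hypothetical names for illustration, not StreamCatcher's actual settings.

```python
# Hypothetical hold table: one entry per AppID that is currently on hold.
HOLD = {
    "demoapp": {
        "on_hold": True,
        "allow_session": "7777",        # the developer's test session
        "notice": "/maintenance.html",  # temporary notice shown to surfers
    },
}

def route(app_id, session, requested_url):
    """Return the notice page for held apps, unless the session is exempt."""
    cfg = HOLD.get(app_id)
    if cfg and cfg["on_hold"] and session != cfg["allow_session"]:
        return cfg["notice"]      # ordinary surfers see the temporary notice
    return requested_url          # the developer's session passes through
```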
The Bottom Line
A StreamCatcher license costs $179 per web server machine, supporting unlimited domains on that machine. http://href.com/hrefshop:detail::0250
FREE Trial on Localhost
Download from http://www.streamcatcher.com