Also known as a Web Crawler, and sometimes described as an Agent or a Bot. In essence, a Crawler is a highly specialised search program designed to 'crawl' around the World Wide Web looking for particular pieces of information: addresses, references, and so on. It does this while the user is off-line, i.e. not connected to the Internet and therefore not running up connection charges. The Crawler searches the Internet around the clock; the next time its user logs on, the results and information gathered so far are transmitted to the user, and the Crawler continues its search.
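The crawl-and-harvest loop described above can be sketched as a breadth-first search over hyperlinks: fetch a page, collect any items of interest, queue the links it contains, and repeat. The following is a minimal illustration only, not any particular product's implementation; the URLs, pages, and the `fetch` callable are invented for the example, and a real crawler would fetch over HTTP and respect robots.txt.

```python
import re
from collections import deque

# A tiny in-memory "web" standing in for real pages: each URL maps to its
# HTML. These URLs and page contents are hypothetical, for illustration only.
PAGES = {
    "http://example.com/": '<a href="http://example.com/a">A</a>',
    "http://example.com/a": '<a href="http://example.com/b">B</a> info@example.com',
    "http://example.com/b": "no links here",
}

def crawl(start_url, fetch, pattern):
    """Breadth-first crawl from start_url, collecting text matching
    pattern (here, e-mail addresses) from every reachable page."""
    seen = {start_url}          # avoid revisiting pages
    queue = deque([start_url])
    found = []
    while queue:
        url = queue.popleft()
        html = fetch(url)
        if html is None:        # unreachable page; skip it
            continue
        found.extend(re.findall(pattern, html))        # harvest items of interest
        for link in re.findall(r'href="([^"]+)"', html):  # follow links
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return found

emails = crawl("http://example.com/", PAGES.get, r"[\w.]+@[\w.]+")
```

Because the visited set and the work queue persist between fetches, the loop can run indefinitely, which is the "24 hours a day" behaviour the definition describes; reporting back to the user is simply a matter of returning (or periodically flushing) the `found` list.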

Although not necessarily benign, Crawlers are not usually malevolent: they merely seek information rather than actively damaging systems. The information gathered may, however, be sensitive, classified, or confidential.

*** The Information Security Glossary ***

This Glossary forms part of the RUsecure Security Policy Suite... visit RUsecure Security Policy World
Use of the guidance contained within RUsecure™ is subject to the End User Licence Agreement
Risk Associates: Resources for Security Risk Analysis, ISO 17799 / BS7799, Security Policies and Security Audit