The Internet can be searched in many different ways. Search engines like Google
and Yahoo let people look for pages containing one or more keywords. Specialized
collection sites like Startpagina.nl or Yahoo Directories offer lists of pages
within a given category.
But what if you want to collect a list of web pages about your own particular topic of interest? Wouldn't it be convenient if you could teach a computer the concept you are looking for in a few simple steps, and let the computer do the searching for you?
ParaBotS develops intelligent web spiders and classifiers that find information meeting specific user requirements. These requirements are translated into search concepts, which guide both the selection of relevant sites and the choice of promising search paths.
Users can create search concepts in a variety of ways: by giving examples of relevant web sites, by providing structured information (about cultural objects, for instance), or by pointing to encyclopaedic information such as a Wikipedia page. Concepts like 'hotel' or 'restaurant' can then be used to retrieve hotel and restaurant sites, or the addresses found on them.
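The actual ParaBotS concept representation is not described here, but the general idea of learning a concept from a few example pages can be sketched with a simple term-vector model: the examples are merged into one word-frequency vector, and a candidate page is scored by its cosine similarity to that vector. All names and the toy example texts below are illustrative assumptions, not the real system.

```python
import math
import re
from collections import Counter

def term_vector(text):
    """Lowercased word-frequency vector for a document."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def build_concept(example_texts):
    """A 'concept' here is simply the summed term vector of the examples."""
    concept = Counter()
    for text in example_texts:
        concept.update(term_vector(text))
    return concept

def relevance(concept, text):
    """Cosine similarity between the concept and a candidate page."""
    vec = term_vector(text)
    dot = sum(concept[w] * vec[w] for w in vec)
    norm = (math.sqrt(sum(c * c for c in concept.values()))
            * math.sqrt(sum(c * c for c in vec.values())))
    return dot / norm if norm else 0.0

# Two toy example pages define a 'hotel' concept.
examples = ["hotel rooms and reservations", "book a hotel near the beach"]
concept = build_concept(examples)
print(relevance(concept, "cheap hotel rooms available"))   # high score
print(relevance(concept, "stock market quarterly report")) # zero overlap
```

A production classifier would of course use far richer features than raw word counts, but even this sketch ranks an on-topic page well above an off-topic one.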
Our intelligent spiders crawl the internet in search of relevant pages and sites. The information found there is structured and stored in a database, or exported in other formats such as XML or HTML.
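A crawl that follows the most promising links first and stores its harvest in a structured format is commonly called a focused (best-first) crawler. The sketch below assumes a tiny in-memory stand-in for the web so it runs without network access; the URLs, the keyword-counting `score` function, and the XML layout are all illustrative assumptions, not the ParaBotS implementation.

```python
import heapq
import xml.etree.ElementTree as ET

# Tiny in-memory stand-in for the web: url -> (page text, outgoing links).
# A real spider would fetch these pages over HTTP.
WEB = {
    "a.example": ("hotel rooms and rates", ["b.example", "c.example"]),
    "b.example": ("restaurant menu and hotel bar", ["c.example"]),
    "c.example": ("sports scores today", []),
}

def score(text):
    # Placeholder relevance function; in practice this would be the
    # concept-based classifier built from the user's examples.
    return text.count("hotel")

def focused_crawl(seed, limit=10):
    """Best-first crawl: always expand the most promising page next."""
    frontier = [(-score(WEB[seed][0]), seed)]  # max-heap via negated scores
    seen, results = {seed}, []
    while frontier and len(results) < limit:
        neg, url = heapq.heappop(frontier)
        text, links = WEB[url]
        if -neg > 0:                      # keep only relevant pages
            results.append((url, text))
        for link in links:
            if link not in seen:
                seen.add(link)
                heapq.heappush(frontier, (-score(WEB[link][0]), link))
    return results

def to_xml(results):
    """Store the harvested pages in a simple XML format."""
    root = ET.Element("pages")
    for url, text in results:
        page = ET.SubElement(root, "page", url=url)
        page.text = text
    return ET.tostring(root, encoding="unicode")

print(to_xml(focused_crawl("a.example")))
```

The priority queue is what makes the crawl "intelligent": off-topic pages are visited last or not at all, so crawl effort concentrates on the interesting search paths.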
The XBotS software suite is a powerful environment for concept-driven web search and a good example of the capabilities of intelligent web collection. It is currently used by European and North American government agencies.