
Xbots
Searching by concept
An intelligent Web Crawler and Classifier. Xbots has been employed by numerous governmental agencies around the world to tackle problems related to financial internet crime. Built around example-driven crawling, Xbots collects and classifies websites based on their content and relevance to a desired concept.

Data extraction
Effortless, Fast, Secure.
Search concepts can be created in several ways: by providing examples of relevant websites, by supplying structured information, or by leveraging encyclopedic sources such as Wikipedia pages. Train concepts like 'hotel' or 'restaurant' to retrieve large numbers of related websites.
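The example-driven idea can be illustrated with a minimal sketch (not the Xbots implementation): a concept is built from a handful of example page texts, and new pages are scored by cosine similarity to the combined term frequencies of those examples. All names here (`Concept`, `score`) are hypothetical.

```python
import math
import re
from collections import Counter

def tokenize(text):
    # Lowercase word tokens; a crude stand-in for real text processing.
    return re.findall(r"[a-z]+", text.lower())

def cosine(a, b):
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) \
        * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

class Concept:
    """A concept trained from example page texts (e.g. known hotel sites)."""
    def __init__(self, examples):
        self.centroid = Counter()
        for text in examples:
            self.centroid.update(Counter(tokenize(text)))

    def score(self, page_text):
        # Relevance of a crawled page to the trained concept, in [0, 1].
        return cosine(self.centroid, Counter(tokenize(page_text)))

hotel = Concept([
    "book a room at our hotel with free breakfast and late checkout",
    "hotel reservations, suites, and spa packages for your stay",
])
# A page about hotel bookings scores higher than an unrelated page.
print(hotel.score("reserve a hotel room with breakfast included"))
print(hotel.score("quarterly earnings report for the mining sector"))
```

A production crawler would use far richer features than raw term counts, but the ranking principle is the same: pages closer to the trained examples score higher.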

Final touches
Advanced Solutions
Upon completing their searches, spiders sort the websites by concept relevance. Users can examine the results through an intuitive application interface, and extract contact addresses using the WebID plug-in.
Information is extracted and stored in a structured format, such as a database or JSON.
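As a sketch of what such a structured record might look like (the field names and the e-mail pattern below are illustrative assumptions, not the WebID plug-in's actual output), a crawled page can be reduced to a JSON document holding its URL, relevance score, and extracted contact addresses:

```python
import json
import re

# Hypothetical extraction step: pull contact e-mail addresses from a
# crawled page and store the result as a structured JSON record.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def extract_record(url, html, relevance):
    return {
        "url": url,
        "relevance": round(relevance, 3),
        "contacts": sorted(set(EMAIL_RE.findall(html))),
    }

record = extract_record(
    "https://example.com",
    "<p>Contact us at info@example.com or sales@example.com</p>",
    0.87,
)
print(json.dumps(record, indent=2))
```

Storing results in a flat, self-describing format like this makes it straightforward to load them into a database or hand them to a downstream analysis tool.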

Connection with DataDetective
Turning Insight Into Action
These structured results can also be exported to DataDetective, the powerful data mining environment created by Parabots' partner Sentient Information Systems. DataDetective is an ideal tool for clustering content.
