
  CatchBot 
 
Bot Description:

CatchBot is the web crawler for Catch, the online division of Cirrus Media Pty Ltd. Cirrus Media is Australia's leading and largest business-to-business publisher and information provider. CatchBot investigates websites for publicly available information about companies, such as a company's name, address, telephone number, and keyword data about its products and services. CatchBot is not designed to access or index any personal information or any information about individuals.
 
Information gathered by CatchBot is stored on password-protected servers, and the security of this information is of the highest importance. Information gathered by CatchBot may be used for business activities undertaken by Catch, such as publishing and maintaining business directories in various countries around the world, industry-specific websites, and online portals.
 

Known user-agent strings:

CatchBot/2.0; +http://www.catchbot.com
CatchBot/1.0; +http://www.catchbot.com
CatchBot/1.0; http://www.catchbot.com





Controlling this bot on your site:

Below are some methods to control the access that this bot (CatchBot) has to your site and/or pages. These methods may not work if the bot does not honor the limits you have established. Select a method: Robots.txt, Meta Tags, or htaccess.


Using Robots.txt:

A Robots.txt file is placed in the root of your website.  Good bots will first look for and review the robots.txt file before either continuing on to other pages on your site or leaving if they are not allowed.  For more information about robots.txt files, visit robotstxt.org.

Do not allow any bot (user-agent) to access any part of your site
User-agent: *
Disallow: /


Allow any bot (user-agent) to access any part of your site
User-agent: *
Disallow:


Do not allow CatchBot to access any part of your site
User-agent: CatchBot
Disallow: /


Allow CatchBot to access any part of your site
User-agent: CatchBot
Disallow:


Allow CatchBot to access your site, but CatchBot is not allowed to access the "admin" folder
User-agent: CatchBot
Disallow: /admin


Allow CatchBot to access your site, but CatchBot is not allowed to access the "admin" folder and the "photos" folder
User-agent: CatchBot
Disallow: /admin
Disallow: /photos
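If you want to confirm that rules like the ones above behave as intended before deploying them, you can test them with Python's standard-library robots.txt parser. This is a sketch only; the file paths checked below are hypothetical placeholders, not paths from any real site.

```python
# Sketch: verify robots.txt rules with Python's standard library.
# The rules mirror the CatchBot example above; the paths are hypothetical.
import urllib.robotparser

rules = """\
User-agent: CatchBot
Disallow: /admin
Disallow: /photos
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# CatchBot may fetch the homepage but not the blocked folders.
print(parser.can_fetch("CatchBot", "/index.html"))           # True
print(parser.can_fetch("CatchBot", "/admin/settings.html"))  # False
print(parser.can_fetch("CatchBot", "/photos/holiday.jpg"))   # False
```

Remember that, as noted above, robots.txt only works for bots that choose to obey it; the parser tells you what a well-behaved bot would do, not what a bad bot will do.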

        


Using Meta Tags:

You can use meta tags in your pages to help control the access bots have to your site.  If you use a template for all your pages, you can add the meta tags between the <head> and </head> tags and they will apply to every page using that template.  If you want to control specific pages, you can instead add the meta tags between the <head> and </head> tags of the individual pages.

Allow all bots to access your page(s)
<meta name="robots" content="index" />

Allow all bots to access your page(s) and follow links on the pages
<meta name="robots" content="index, follow" />

Allow all bots to access your page(s) but do not allow them to follow links
<meta name="robots" content="index, nofollow" />

Do not allow any bots to access your page(s)
<meta name="robots" content="noindex" />

Allow CatchBot to access your page(s)
<meta name="CatchBot" content="index">

Do not allow CatchBot to access your page(s)
<meta name="CatchBot" content="noindex">

Allow CatchBot to access your page(s) and follow the links to more pages
<meta name="CatchBot" content="index, follow">
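To double-check which robots directives a page actually declares, you can extract its meta tags with Python's standard-library HTML parser. This is a minimal sketch; the example page below is hypothetical.

```python
# Sketch: list the robots-style <meta name="..."> directives on a page,
# using only the standard library. The HTML snippet is a made-up example.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content value of every <meta> tag that has a name."""
    def __init__(self):
        super().__init__()
        self.directives = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if "name" in d and "content" in d:
                # Meta names are case-insensitive, so normalize to lowercase.
                self.directives[d["name"].lower()] = d["content"]

page = '<html><head><meta name="CatchBot" content="noindex" /></head><body></body></html>'
p = RobotsMetaParser()
p.feed(page)
print(p.directives.get("catchbot"))  # noindex
```

This is handy when a template spans many pages and you want to confirm the directive landed inside <head> on each of them.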

        


Using HTACCESS:

You may have the ability to add and/or modify an htaccess file on your server.  The htaccess file can be used to control bots at the server level.  Add the following to the .htaccess file on your server to block specific bots from visiting your site.  Be sure to replace the Enter User-Agent placeholders with the user-agents of the bots you would like to block, similar to the user-agent (CatchBot) listed in the example below.

SetEnvIfNoCase User-Agent ^$ bad_bot #leave this for blank user-agents
SetEnvIfNoCase User-Agent "^CatchBot" bad_bot
SetEnvIfNoCase User-Agent "^Enter User-Agent" bad_bot
SetEnvIfNoCase User-Agent "^Enter User-Agent" bad_bot
SetEnvIfNoCase User-Agent "^Enter User-Agent" bad_bot
SetEnvIfNoCase User-Agent "^Enter User-Agent" bad_bot

<Limit GET POST HEAD>
Order Allow,Deny
Allow from all
Deny from env=bad_bot
</Limit>
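The SetEnvIfNoCase lines above perform a case-insensitive regular-expression match against the User-Agent request header. Before editing your .htaccess, you can sanity-check a pattern with an equivalent match in Python; this sketch uses the two real patterns from the example above, and the sample user-agent strings are illustrative.

```python
# Sketch: mimic SetEnvIfNoCase matching (case-insensitive regex against
# the User-Agent header) so patterns can be tested before deployment.
import re

# Patterns from the .htaccess example: blank user-agents and CatchBot.
patterns = [r"^$", r"^CatchBot"]

def is_bad_bot(user_agent: str) -> bool:
    """Return True if any pattern matches, ignoring case (the NoCase part)."""
    return any(re.search(p, user_agent, re.IGNORECASE) for p in patterns)

print(is_bad_bot("CatchBot/2.0; +http://www.catchbot.com"))  # True
print(is_bad_bot("catchbot/1.0"))   # True  (NoCase ignores capitalization)
print(is_bad_bot(""))               # True  (blank user-agent)
print(is_bad_bot("Mozilla/5.0"))    # False
```

Note that a blocked bot can trivially change its user-agent string, so this is a convenience filter, not a security boundary.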