
  CyberPatrol SiteCat Webbot 
 
Bot Description:

Cyber Patrol is a web-filtering product that monitors online activity.  Its exact purpose is unconfirmed, but this bot may be checking websites to add them to a block list, or crawling sites that have been visited by people who use Cyber Patrol within their organization or at home.

http://www.cyberpatrol.com/





Controlling this bot on your site:

Below are some methods to control the access that CyberPatrol SiteCat Webbot has to your site and/or pages.  These methods may not work if the bot ignores the limits you have established.  Select a method: Robots.txt, Meta Tags, htaccess.


Using Robots.txt:

A robots.txt file is placed in the root of your website.  Well-behaved bots look for and read the robots.txt file first, then either continue on to other pages of your site or leave if they are not allowed.  For more information about robots.txt files, visit robotstxt.org.

Do not allow any bot (user-agent) to access any part of your site
User-agent: *
Disallow: /


Allow any bot (user-agent) to access any part of your site
User-agent: *
Disallow:


Do not allow CyberPatrol SiteCat Webbot to access any part of your site
User-agent: CyberPatrol SiteCat Webbot
Disallow: /


Allow CyberPatrol SiteCat Webbot to access any part of your site
User-agent: CyberPatrol SiteCat Webbot
Disallow:


Allow CyberPatrol SiteCat Webbot to access your site, but CyberPatrol SiteCat Webbot is not allowed to access the "admin" folder
User-agent: CyberPatrol SiteCat Webbot
Disallow: /admin


Allow CyberPatrol SiteCat Webbot to access your site, but CyberPatrol SiteCat Webbot is not allowed to access the "admin" folder and the "photos" folder
User-agent: CyberPatrol SiteCat Webbot
Disallow: /admin
Disallow: /photos
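You can check rules like these before deploying them.  As a sketch, Python's standard urllib.robotparser module can parse robots.txt content and report whether a given user-agent is allowed to fetch a path (the rules below mirror the "admin" example above):

```python
from urllib.robotparser import RobotFileParser

# robots.txt content mirroring the "admin" example above:
# block CyberPatrol SiteCat Webbot from the /admin folder only
rules = """User-agent: CyberPatrol SiteCat Webbot
Disallow: /admin
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The bot may fetch the home page but not the admin folder
print(parser.can_fetch("CyberPatrol SiteCat Webbot", "/"))        # True
print(parser.can_fetch("CyberPatrol SiteCat Webbot", "/admin/"))  # False
```

Remember that this only tells you what a rule-following bot would do; a bot that ignores robots.txt is unaffected.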

        


Using Meta Tags:

You can use meta tags in your pages to help control the access bots have to your site.  If you use a template for all your pages, you can add the meta tags between the <head> and </head> tags and they will apply to every page that uses that template.  To control specific pages instead, add the meta tags between the <head> and </head> tags of those individual pages.

Allow all bots to access your page(s)
<meta name="robots" content="index" />

Allow all bots to access your page(s) and follow links on the pages
<meta name="robots" content="index, follow" />

Allow all bots to access your page(s) but do not allow them to follow links
<meta name="robots" content="index, nofollow" />

Do not allow any bots to access your page(s)
<meta name="robots" content="noindex" />

Allow CyberPatrol SiteCat Webbot to access your page(s)
<meta name="CyberPatrol SiteCat Webbot" content="index" />

Do not allow CyberPatrol SiteCat Webbot to access your page(s)
<meta name="CyberPatrol SiteCat Webbot" content="noindex" />

Allow CyberPatrol SiteCat Webbot to access your page(s) and follow the links to more pages
<meta name="CyberPatrol SiteCat Webbot" content="index, follow" />
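As a minimal sketch of where these tags belong, here is a placeholder page that blocks all bots from indexing it (the title and body content are hypothetical):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <!-- keep this page out of search indexes -->
  <meta name="robots" content="noindex" />
</head>
<body>
  ...
</body>
</html>
```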

        


Using HTACCESS:

You may have the ability to add or modify an htaccess file on your server.  The htaccess file can be used to control bots at the server level.  Add the following to the .htaccess file on your server to block specific bots from visiting your site.  Be sure to replace the Enter User-Agent placeholder with the user-agents of the bots you would like to block, similar to the user-agent (CyberPatrol SiteCat Webbot) listed in the example below.

# leave this line in place to catch blank user-agents
SetEnvIfNoCase User-Agent ^$ bad_bot
SetEnvIfNoCase User-Agent "^CyberPatrol SiteCat Webbot" bad_bot
# add one SetEnvIfNoCase line per additional bot to block
SetEnvIfNoCase User-Agent "^Enter User-Agent" bad_bot

<Limit GET POST HEAD>
Order Allow,Deny
Allow from all
Deny from env=bad_bot
</Limit>
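Note that the Order/Allow/Deny directives above are Apache 2.2 syntax; on Apache 2.4 they only work if mod_access_compat is enabled.  A sketch of the equivalent rule in native 2.4 syntax, reusing the same bad_bot environment variable set above:

```apache
SetEnvIfNoCase User-Agent "^CyberPatrol SiteCat Webbot" bad_bot

<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>
```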