Blocking spider access with pseudo-static (URL rewrite) rules
Author: yjdl    Published: 2013-11-21 19:56:56


 

On Linux — rule file .htaccess (create the .htaccess file by hand in the site's root directory):

<IfModule mod_rewrite.c>
RewriteEngine On
# Block spiders: return 403 to matched user agents for every URL except robots.txt
RewriteCond %{HTTP_USER_AGENT} "Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]
RewriteRule !(^robots\.txt$) - [F]
</IfModule>
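Before deploying, the user-agent alternation can be sanity-checked offline. A minimal sketch in Python (the pattern is copied from the RewriteCond above; `re.IGNORECASE` stands in for Apache's `[NC]` flag):

```python
import re

# User-agent alternation copied verbatim from the RewriteCond rule;
# re.IGNORECASE mirrors Apache's [NC] (no-case) flag.
BLOCKED = re.compile(
    r"Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider"
    r"|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa"
    r"|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl"
    r"|Python|Wget|Xenu|ZmEu",
    re.IGNORECASE,
)

def is_blocked(user_agent: str) -> bool:
    """True if the rewrite rule would answer 403 for this User-Agent."""
    return BLOCKED.search(user_agent) is not None

print(is_blocked("Mozilla/5.0 (compatible; AhrefsBot/7.0)"))  # True
print(is_blocked("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # False
```

Note that the rule performs a substring match, so `Wget` also catches `wget/1.21.3`; Googlebot passes because none of the listed names occur in its user-agent string.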

On Windows Server 2003 — rule file httpd.conf (in the virtual-host control panel, use "ISAPI filter custom settings" to enable custom URL rewriting with ISAPI_Rewrite 3.1):

 

# Block spiders: same pattern as above; note the leading slash in the ISAPI_Rewrite path match
RewriteCond %{HTTP_USER_AGENT} "Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]
RewriteRule !(^/robots\.txt$) - [F]

On Windows Server 2008 — web.config (IIS URL Rewrite module; the rule goes inside &lt;system.webServer&gt;&lt;rewrite&gt;&lt;rules&gt;):

<rule name="Block spider">
      <match url="(^robots\.txt$)" ignoreCase="false" negate="true" />
      <conditions>
        <!-- ignoreCase="true" matches the [NC] flag used in the Apache/ISAPI rules above -->
        <add input="{HTTP_USER_AGENT}" pattern="Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" ignoreCase="true" />
      </conditions>
      <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Forbidden" />
</rule>



Note: by default the rules block a selection of obscure spiders; to block others, add their names to the pattern in the same format.
User-agent names of the major spiders:
google spider: googlebot
baidu spider: baiduspider
yahoo spider: slurp
alexa spider: ia_archiver
msn spider: msnbot
bing spider: bingbot
altavista spider: scooter
lycos spider: lycos_spider_(t-rex)
alltheweb spider: fast-webcrawler
inktomi spider: slurp
youdao spiders: YodaoBot, OutfoxBot
retu (热土) spider: Adminrtspider
sogou spider: sogou spider
SOSO spider: sosospider
360 Search spider: 360spider
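The rewrite rules deliberately leave robots.txt reachable, so crawlers from the list above that honor the Robots Exclusion Protocol can also be turned away the polite way. A hypothetical robots.txt that refuses two of the blocked bots while allowing everything else:

```
# Refuse specific crawlers by user-agent name
User-agent: MJ12bot
Disallow: /

User-agent: AhrefsBot
Disallow: /

# All other crawlers may index the site
User-agent: *
Disallow:
```

robots.txt is only advisory; the 403 rules above remain the enforcement layer for bots that ignore it.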

 

 


