How to block certain robots and referer pages at the Apache root level

When I was using Lighttpd, I could do this easily with entries like the ones below, and all of the sites were protected that way.

The Wget bot:

$HTTP["useragent"] =~ "Wget" {
        $HTTP["url"] =~ "^/tagi(.*)" {
#       $HTTP["url"] =~ "" {
                url.access-deny = ( "" )
        }

        $HTTP["url"] =~ "^/tags(.*)" {
                url.access-deny = ( "" )
        }

        $HTTP["url"] =~ "^/kom.php(.*)" {
                url.access-deny = ( "" )
        }

        $HTTP["querystring"] =~ "^(.*)strony(.*)" {
                url.access-deny = ( "" )
        }

        $HTTP["querystring"] =~ "^(.*)page(.*)" {
                url.access-deny = ( "" )
        }

        $HTTP["url"] =~ "^(.*)/www/delivery/lg.php(.*)" {
                url.access-deny = ( "" )
        }

        $HTTP["url"] =~ "^(.*)/reklamy/(.*)" {
                url.access-deny = ( "" )
        }

        $HTTP["url"] =~ "^(.*)/ads/(.*)" {
                url.access-deny = ( "" )
        }

        $HTTP["url"] =~ "^(.*)/www/delivery/ck.php(.*)" {
                url.access-deny = ( "" )
        }
}

Sites sending fake traffic:

$HTTP["referer"] =~ "(.*)surfing.php(.*)" {
        url.access-deny = ( "" )
}

$HTTP["referer"] =~ "(.*)promote.php(.*)" {
        url.access-deny = ( "" )
}

$HTTP["referer"] =~ "(.*)trafficadder.php(.*)" {
        url.access-deny = ( "" )
}

$HTTP["referer"] =~ "(.*)traffic.php(.*)" {
        url.access-deny = ( "" )
}

$HTTP["referer"] =~ ".*loic*." {
       url.access-deny = ( "" )
}

$HTTP["referer"] =~ ".*autosurf*." {
       url.access-deny = ( "" )
}

How can I do the same thing in Apache? I don't want to add it to .htaccess.

Answer 1

You can do this with mod_rewrite, although it takes a bit of work. Here are some starting points:

http://httpd.apache.org/docs/2.4/rewrite/access.html

Pay particular attention to the "Blocking of Robots" section: http://httpd.apache.org/docs/2.4/rewrite/access.html#blocking-of-robots
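For reference, a rough, untested sketch of what the Wget rules above could look like with mod_rewrite, assuming Apache 2.4 and that the block lives in the main server config (httpd.conf or a conf.d/ include) so it covers every vhost without .htaccess. RewriteCond lines combine with an implicit AND; the [OR] flag groups the three URL/query-string conditions together, and that group is then ANDed with the User-Agent check:

# Deny Wget on the same paths as the lighttpd rules above
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Wget
RewriteCond %{REQUEST_URI} ^/(tagi|tags|kom\.php) [OR]
RewriteCond %{REQUEST_URI} /(www/delivery/(lg|ck)\.php|reklamy/|ads/) [OR]
RewriteCond %{QUERY_STRING} (strony|page)
RewriteRule ^ - [F]

The rule "RewriteRule ^ - [F]" leaves the URL untouched ("-") and answers any matching request with 403 Forbidden.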

See also: http://en.linuxreviews.org/HOWTO_stop_automated_spam-bots_using_.htaccess
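For the fake-traffic referers you don't even need mod_rewrite: the "Blocking of Robots" section linked above shows a SetEnvIf + Require pattern that works on Apache 2.4. A minimal sketch along those lines (the environment variable name bad_referer is arbitrary):

# Mark requests whose Referer points at known traffic-exchange scripts
SetEnvIf Referer "(surfing|promote|trafficadder|traffic)\.php" bad_referer
SetEnvIf Referer "(loic|autosurf)" bad_referer

<Location "/">
    <RequireAll>
        Require all granted
        Require not env bad_referer
    </RequireAll>
</Location>

On Apache 2.2 the equivalent would be "Order Allow,Deny", "Allow from all" and "Deny from env=bad_referer".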
