
Secure Your WordPress Against User-Agents and Bots

"Lately there have been a lot of WordPress sites compromised only due to the bots that roam the world wide web! There are a lot of plugins out there which can protect your WordPress baby by blocking these “roguish” bots!
In this article you will be learning an easy and useful method of adeptly configuring your .htaccess file to filter these bots which can infect your website and can eat up your server resources. So get your .htaccess file ready for editing!


Step 1 Preparing the Code

The code mainly consists of bot names. I have included the most common bad bots I can think of; if one is missing, please mention it in the comments.
The code is pretty straightforward. Go ahead and copy the code below and paste it into your .htaccess file.
# Bot Blocker
<IfModule mod_setenvif.c>
 # Flag requests that send an empty User-Agent header
 SetEnvIfNoCase User-Agent ^$ keep_out
 # Flag requests whose User-Agent matches any of these bot names
 SetEnvIfNoCase User-Agent (pycurl|casper|cmsworldmap|diavol|dotbot) keep_out
 SetEnvIfNoCase User-Agent (flicky|ia_archiver|jakarta|kmccrew) keep_out
 SetEnvIfNoCase User-Agent (purebot|comodo|feedfinder|planetwork) keep_out
 # Deny flagged requests (these rules apply to GET, POST and PUT requests)
 <Limit GET POST PUT>
   Order Allow,Deny
   Allow from all
   Deny from env=keep_out
 </Limit>
</IfModule>
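Note that Order, Allow and Deny are the Apache 2.2 access-control directives. Apache 2.4 still accepts them when mod_access_compat is loaded, but if your host only supports the newer Require syntax, the same idea can be expressed as in the sketch below (assuming mod_authz_core is available; the bot list is unchanged):
# Bot Blocker (Apache 2.4+ syntax)
<IfModule mod_setenvif.c>
 SetEnvIfNoCase User-Agent ^$ keep_out
 SetEnvIfNoCase User-Agent (pycurl|casper|cmsworldmap|diavol|dotbot) keep_out
 SetEnvIfNoCase User-Agent (flicky|ia_archiver|jakarta|kmccrew) keep_out
 SetEnvIfNoCase User-Agent (purebot|comodo|feedfinder|planetwork) keep_out
 <Limit GET POST PUT>
   # Allow everyone except requests flagged with the keep_out variable
   <RequireAll>
     Require all granted
     Require not env keep_out
   </RequireAll>
 </Limit>
</IfModule>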

Step 2 Testing the Code

To see whether the code is doing its job, I recommend using the website Bots VS Browsers, which is a good place to simulate these kinds of requests. Once on the site, select any bot from the code you just added to your .htaccess file and use it as the user agent, enter the URL of your site, and hit Enter. If you see a “403 Error”, the code is doing its job. If not, the code probably got mangled while being copied into your .htaccess file, so try again.
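If you would rather test from your own computer instead of a third-party site, a short script that sends a request with a spoofed user agent does the same job. Below is a minimal sketch in Python; the URL is a placeholder for your own site, and “dotbot” can be swapped for any agent in your list:
import urllib.request
import urllib.error

# Placeholder URL -- replace with your own site.
url = "https://example.com/"

# Pretend to be one of the bots listed in the .htaccess rules.
request = urllib.request.Request(url, headers={"User-Agent": "dotbot"})

try:
    with urllib.request.urlopen(request) as response:
        # The request got through, so the rules did not match this agent.
        print("Not blocked, status:", response.status)
except urllib.error.HTTPError as error:
    # A 403 here means the Bot Blocker rules are doing their job.
    print("Blocked, status:", error.code)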

Step 3 Adding More Bots

Now that you are familiar with the code and how to test it, we can add more bots to it. You must have noticed the repetition in the code; using the same logic, you can add a dozen more bots to be blocked with the same parameters. Cool huh!
SetEnvIfNoCase User-Agent (i-IS-evilBOT) keep_out
As you can see in the code above, I am now blocking “i-IS-evilBOT” (which I just made up). The bot name is not case sensitive, so you can write it however you like. Go back to the Bots VS Browsers page and this time enter the user agent we just created, and voila, you’ll see that it is blocked too! You can add as many bots as you want to be blocked, separated with the pipe character “|”, as shown below.
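For example, a single line can flag several bots at once (the names below are made up, just like “i-IS-evilBOT”):
SetEnvIfNoCase User-Agent (i-IS-evilBOT|another-evil-bot|some-nasty-scraper) keep_out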

Conclusion

I said in the beginning that there are many plugins that can do the same thing, so you could avoid this editing altogether. But by editing the .htaccess file yourself, you can block bad user-agents and bots without the overhead of yet another plugin, which means better performance for your site!