Usually the fight comes down to installing various antispam extensions such as CAPTCHAs, security questions, and so on. Not only are these not very effective (bots learned to get around them long ago), they also do nothing to solve the problem of growing load, and in some cases they make it worse.
One fairly effective method of protection is to rename the standard form fields through which the site receives data (a sketch of the idea follows the list below). Much has been written about such methods on the web, and they really do cut off almost all bots, but again there are significant drawbacks:
- The load on the site is not reduced, since the requests still come in and are handled by the site's engine.
- The method usually requires substantial changes to the engine's files, which can seriously complicate updates and lead to other unpleasant consequences.
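To make the renaming idea concrete, here is a minimal sketch of the server-side check. It is written in Python with Flask purely for illustration (the article's context is a PHP engine such as phpBB), and the field names `message` and `msg_body_x7` are hypothetical.

```python
# Minimal sketch, Python/Flask assumed for illustration only.
# The real form in the site's template uses a non-standard field name
# (here "msg_body_x7") instead of the engine's default one ("message").
# Bots that submit the standard field names never supply the renamed field.
from flask import Flask, request, abort

app = Flask(__name__)

def save_comment(body: str) -> None:
    # Placeholder for the engine's real storage logic.
    print("saved:", body)

@app.route("/post_comment", methods=["POST"])
def post_comment():
    if "message" in request.form:            # the old standard name gives a bot away
        abort(404)
    body = request.form.get("msg_body_x7")   # the renamed field the real form uses
    if not body:
        abort(404)
    save_comment(body)
    return "Comment accepted"
```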
In our view, the most effective way to screen out spam bots is to analyze the data they pass in their requests. An example of protecting a phpBB 3 forum in this way can be found in an article on Habr. The whole idea is to capture requests to the site from ordinary users and from bots, compare them, and find at least one difference between them.
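As a rough illustration of that comparison step, the snippet below takes two captured header sets, one from a normal browser and one from a bot, and prints every header whose presence or value differs. The sample values are made up, not real captured data.

```python
# Minimal sketch of the comparison step; the header values are invented
# examples of what server logs might show.
browser_request = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Accept-Language": "en-US,en;q=0.9",
    "Referer": "https://example.com/forum/posting.php",
    "Cookie": "phpbb3_sid=abc123",
}
bot_request = {
    "User-Agent": "Mozilla/4.0 (compatible; MSIE 6.0)",
    # no Accept-Language, no Referer, no session cookie
}

def diff_requests(a: dict, b: dict) -> dict:
    """Return every header whose presence or value differs between two requests."""
    keys = set(a) | set(b)
    return {k: (a.get(k), b.get(k)) for k in sorted(keys) if a.get(k) != b.get(k)}

for header, (user_value, bot_value) in diff_requests(browser_request, bot_request).items():
    print(f"{header}: user={user_value!r}, bot={bot_value!r}")
```

Any header that consistently shows up on one side only (a missing Referer, a missing Accept-Language, an absent session cookie) is a candidate for the filtering rule described next.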
Next, write a small script that uses this difference to catch the bots and cut them off from the site by returning a 404 error. The bots then no longer interact with the site itself, only with our small script, which reduces the load they create to almost zero. Another important advantage is that there is no need to install CAPTCHAs, test questions, and the like on the site, which greatly simplifies life for your users; the bots never even get that far.
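Here is a minimal sketch of such a filtering script, again in Python/Flask for illustration. The particular check (missing Accept-Language or Referer on a POST) is only a hypothetical example of a difference found in the previous step; in a real deployment the check would sit as early as possible, before the engine is loaded, so that rejected requests cost almost nothing.

```python
# Minimal sketch, Python/Flask assumed; the header check is a hypothetical
# example of a difference found by comparing user and bot requests.
from flask import Flask, request, abort

app = Flask(__name__)

@app.before_request
def drop_bots():
    # Only data-submitting requests are filtered; ordinary page views pass through.
    if request.method == "POST":
        if not request.headers.get("Accept-Language") or not request.headers.get("Referer"):
            abort(404)  # the bot gets a 404 and never reaches the engine

@app.route("/post_comment", methods=["POST"])
def post_comment():
    return "Comment accepted"  # placeholder for the engine's real handler
```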
The same approach can cut bots off from virtually any site: it can prevent them from registering or posting comments, and even protect the site administrator's account from password guessing.
All you need to do is compare the requests, find the difference, and write a script along the lines of the examples above. Naturally, if you cannot do this yourself, you can always ask our experts for help.