CleanTalk Anti-Flood and Anti-Crawler (Bot Protection) Options
CleanTalk SpamFireWall
SpamFireWall (SFW) is an additional option and part of the CleanTalk Anti-Spam plugin that blocks access to the website for the most spam-active IP addresses (it blocks GET requests). Spambots are stopped by the website firewall before they get access to the website, which prevents them from loading pages, so your web server doesn't need to run the scripts on those pages. This can reduce the load on the database and web server.
If Varnish is used on your server, SFW may affect the site loading speed; in this case, disable SFW in the plugin settings.
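For illustration, here is a minimal sketch of the idea behind SpamFireWall: the visitor's IP is checked against a blocklist before any page scripts run. This is not CleanTalk's actual code; the WSGI middleware and the example IPs below are hypothetical.

```python
# Simplified illustration of the SpamFireWall idea: check the visitor's IP
# against a blocklist before any page scripts run. This is NOT CleanTalk's
# code; the blocklist and middleware below are hypothetical.

BLOCKED_IPS = {"203.0.113.7", "198.51.100.23"}  # example spam-active IPs

def spam_firewall(app):
    """WSGI middleware that answers GET requests from blocked IPs early."""
    def wrapper(environ, start_response):
        ip = environ.get("REMOTE_ADDR", "")
        if environ.get("REQUEST_METHOD") == "GET" and ip in BLOCKED_IPS:
            # The real page is never rendered, so the application scripts
            # and the database are not touched at all.
            start_response("403 Forbidden", [("Content-Type", "text/html")])
            return [b"<h1>Access denied by SpamFireWall</h1>"]
        return app(environ, start_response)
    return wrapper
```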
The additional SpamFireWall options, Anti-Flood and Anti-Crawler (Bot Protection), are designed to block unwanted bots that search for vulnerabilities on a website, attempt to hack a site, collect personal data, parse prices, content, or images, generate 404 error pages, or scan the website aggressively. Aggressive scanning bots can put a heavy load on the web server, so the website slows down and, as a result, Google's ranking system may lower the website's position.
Learn more about the problems bad bots can cause on your website in our Blog.
If you need to allow access for bots that are not in the exceptions list, you can use your own white lists to allow any IP to visit your website.
More information about white lists is here: https://cleantalk.org/help/sfw-blacklist-usage.
How to manage your Private list by User-Agents: https://cleantalk.org/help/filter-ua
How the Options Work
CleanTalk Anti-Crawler (Bot Protection) — this option is meant to block all types of bots that visit website pages to search for vulnerabilities, attempt to hack the site, collect personal data, parse prices, content, or images, generate 404 error pages, or scan the website aggressively. Aggressive scanning bots can put a heavy load on the web server, so the website slows down and, as a result, Google's ranking system may lower the website's position.
This option has a list of exceptions so that requests from bots such as Google, Bing, and Baidu are not blocked.
You can see the full list on the help page https://cleantalk.org/help/filter-ua. These bots are whitelisted by default, but you can change their status for your site and block selected bots. The option is disabled by default; you can enable it in the plugin settings: WordPress Dashboard -> Settings -> Antispam by Cleantalk -> Advanced settings -> Anti-Crawler (Bot Protection).
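For illustration only, the sketch below shows how such a User-Agent exception list could be applied; the entries and the function are hypothetical, and the authoritative list is the one at https://cleantalk.org/help/filter-ua.

```python
# Simplified illustration of a User-Agent exception list. The entries and
# function are hypothetical, not the plugin's actual whitelist or code.

WHITELISTED_UA_FRAGMENTS = ["Googlebot", "Bingbot", "Baiduspider"]

def is_whitelisted_bot(user_agent: str) -> bool:
    """Return True if the request comes from a bot excluded from blocking."""
    ua = user_agent.lower()
    return any(fragment.lower() in ua for fragment in WHITELISTED_UA_FRAGMENTS)
```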
The first visit from any IP triggers a spam check. If the check fails, the visitor is shown the Anti-Crawler (Bot Protection) blocking screen on the second request to your site made within 3 seconds, so a bot cannot pass the verification and leave the blocking page. The blocking screen runs its own spam check, and the visitor is transferred to the website when the timer runs out if everything is OK.
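The sketch below roughly mirrors the described flow under simplified assumptions (per-IP timestamps kept in memory, hypothetical function names); the plugin itself uses its own storage and a JavaScript-based check.

```python
import time

# Rough sketch of the described flow: the first visit from an IP triggers a
# spam check; an IP that failed it and comes back within 3 seconds is served
# the blocking screen instead of the page. Names and storage are hypothetical.

FIRST_SEEN: dict[str, float] = {}   # ip -> time of the failed first request

def handle_request(ip: str, failed_spam_check: bool) -> str:
    now = time.time()
    if ip not in FIRST_SEEN:
        if failed_spam_check:
            FIRST_SEEN[ip] = now     # remember the failed first visit
        return "serve page"          # the first request itself goes through
    if now - FIRST_SEEN[ip] <= 3:
        # Second request within 3 seconds: show the blocking screen, which
        # runs its own check and forwards real visitors afterwards.
        return "serve blocking screen"
    del FIRST_SEEN[ip]
    return "serve page"
```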
If you have issues with false positives, disable the Anti-Crawler (Bot Protection) option. These problems may be related to how cookies are used on the site or to caching plugins. Set the option "Set cookies" to "Use alternative mechanism for cookies" or to "Off", and switch the option "Add a CleanTalk Pixel to improve IP-detection" to "Via JavaScript".
If this doesn't help and you want to keep using the option, contact our technical support: https://cleantalk.org/my/support.
CleanTalk Anti-Flood — this option is meant to block aggressive bots. You can set the maximum number of website pages a visitor can open within 1 minute. If an IP exceeds this number, it gets the CleanTalk blocking screen for 30 seconds and cannot open any website pages while the timer is running. When the timer ends, the IP can continue visiting your pages and the Anti-Flood option starts counting page visits again.
For example, the default limit is 20 visits per minute. That means any visitor who opens 20 website pages within 1 minute will be blocked for 30 seconds and will not be able to see your website during that time.
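As a simplified illustration of this logic, the sketch below counts page views per IP in a 1-minute window and blocks for 30 seconds once the limit is exceeded; the values, names, and storage are illustrative, not the plugin's internals.

```python
import time

# Simplified per-IP page-view limiter mirroring the Anti-Flood description:
# too many views within WINDOW seconds triggers a BLOCK_SECONDS block.

PAGE_LIMIT = 20        # default Anti-Flood Page Views Limit
WINDOW = 60            # counting window, seconds
BLOCK_SECONDS = 30     # length of the block

visits: dict[str, list[float]] = {}   # ip -> timestamps of recent page views
blocked_until: dict[str, float] = {}  # ip -> end of the current block

def allow_page_view(ip: str) -> bool:
    now = time.time()
    if blocked_until.get(ip, 0) > now:
        return False                              # still on the 30-second timer
    recent = [t for t in visits.get(ip, []) if now - t < WINDOW]
    recent.append(now)
    visits[ip] = recent
    if len(recent) > PAGE_LIMIT:
        blocked_until[ip] = now + BLOCK_SECONDS   # show the blocking screen
        visits[ip] = []                           # counting restarts afterwards
        return False
    return True
```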
This option has a list of exceptions so that requests from bots such as Google, Bing, and Baidu are not blocked.
You can see the full list on the help page https://cleantalk.org/help/filter-ua. These bots are whitelisted by default, but you can change their status for your site and block selected bots. The option is disabled by default; you can enable it in the plugin settings: WordPress Dashboard -> Settings -> Antispam by Cleantalk -> Advanced settings -> Anti-Flood.
You can set the number of website pages an IP can visit before getting blocked. The option is in the CleanTalk plugin settings: Anti-Spam by CleanTalk -> Settings -> Advanced Settings -> Anti-Flood Page Views Limit.
Block statistics for the Anti-Flood and Anti-Crawler options are available in the SpamFireWall Log: https://cleantalk.org/my/show_sfw
You may also find these articles useful:
- Anti DDoS Lite. DDoS Protection & Mitigation
- User-Agent Filtration with Anti-Crawler
- Blocking crawler bots by user-agent