I’m trying to implement some protection against automated attacks on my website (mostly scripted probing for SQL injection and the like). My approach is to check whether two consecutive visits from the same IP fall within a certain time window; if they do, I serve a blank page with a short message saying that the visit frequency is too high.
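Roughly, the check looks like this (a simplified sketch; the 2-second threshold and the in-memory store are just placeholders for whatever I end up using):

```python
import time

# Minimum allowed interval between consecutive visits from the same IP.
# The 2-second value here is a placeholder, not a recommendation.
MIN_INTERVAL = 2.0

last_visit = {}  # ip -> timestamp of that IP's previous request

def is_too_frequent(ip, now=None):
    """Return True if this visit comes too soon after the previous one
    from the same IP, updating the stored timestamp either way."""
    now = time.time() if now is None else now
    previous = last_visit.get(ip)
    last_visit[ip] = now
    return previous is not None and (now - previous) < MIN_INTERVAL
```

When `is_too_frequent()` returns True, I show the blank page instead of the requested one.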
But now I’m wondering: when Google, Bing, DuckDuckGo, or another reputable search engine crawls a site, do we know the time interval between their page visits?
For instance, I assume they won’t load all pages of the website at once, or within a short span of time (say, measured in seconds).
PS. By “reputable” I mean search engines that play by the rules they clearly post on their websites.