
What is the procedure for stopping robots and malicious scanners that slow down a site?

What should I do to prevent users from running scanners or auto-posting robots against my site that would slow down site processing?

Is it sufficient to timestamp each post a user makes and enforce a posting delay? How long should the interval be?

What else can I do besides the above and CAPTCHAs on form posts?

Thanks.


A time interval is a good idea and is used on Stack Overflow. Different operations should have different time limits depending on:

  1. How often ordinary users are likely to want to use that feature.
  2. How intensive the operation is.

If you have an operation that requires a lot of processing time, you might want to set the limit on that operation higher than for a relatively simple operation.
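As a rough sketch of the timestamp-and-delay idea (the class and method names here are made up, and the storage is an in-memory map rather than your database), the check on each form post could look something like this:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Minimal per-user, per-action cooldown check. Names (ActionThrottle,
 * tryAcquire) are illustrative, not from any particular framework.
 */
public class ActionThrottle {

    // last successful attempt per "userId:action" key
    private final Map<String, Instant> lastAttempt = new ConcurrentHashMap<>();

    /**
     * Returns true and records the attempt if the cooldown for this action
     * has elapsed for this user; returns false otherwise.
     */
    public boolean tryAcquire(String userId, String action, Duration cooldown) {
        String key = userId + ":" + action;
        Instant now = Instant.now();
        boolean[] allowed = {false};
        // compute() keeps the check-and-update atomic for this map entry
        lastAttempt.compute(key, (k, previous) -> {
            if (previous == null || Duration.between(previous, now).compareTo(cooldown) >= 0) {
                allowed[0] = true;
                return now;       // cooldown elapsed: record the new attempt
            }
            return previous;      // still cooling down: keep the old timestamp
        });
        return allowed[0];
    }
}
```

A cheap action like commenting might call `tryAcquire(userId, "comment", Duration.ofSeconds(15))`, while an expensive one like a site-wide search or export might use a cooldown of a minute or more, in line with point 2 above.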

Stack Overflow combines time limits with CAPTCHAs for editing posts. If you edit too frequently, you have to pass a CAPTCHA test.
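A sketch of that kind of gate (the threshold, window, and names below are invented for illustration; they are not Stack Overflow's actual rules): count each user's recent edits in a sliding window and demand a CAPTCHA once the count passes a limit.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Counts a user's edits in a sliding window; above a threshold, the form
 * handler should require a CAPTCHA before accepting the edit. The limit
 * and window below are placeholders.
 */
public class EditCaptchaGate {

    private static final int FREE_EDITS_PER_WINDOW = 5;
    private static final Duration WINDOW = Duration.ofMinutes(10);

    private final Map<String, Deque<Instant>> editTimes = new ConcurrentHashMap<>();

    /** Records an edit attempt and reports whether it needs a CAPTCHA. */
    public boolean captchaRequired(String userId) {
        Instant now = Instant.now();
        Instant cutoff = now.minus(WINDOW);
        Deque<Instant> times = editTimes.computeIfAbsent(userId, k -> new ArrayDeque<>());
        synchronized (times) {
            // discard edits that have aged out of the window
            while (!times.isEmpty() && times.peekFirst().isBefore(cutoff)) {
                times.pollFirst();
            }
            times.addLast(now);
            return times.size() > FREE_EDITS_PER_WINDOW;
        }
    }
}
```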


I Googled this a year or so ago and found a list of known "bad user agents", which I added to my .htaccess file to block them from accessing my blog. This small change had a significant impact on my bandwidth usage.
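If your site happens to be a Java web application, a rough equivalent of that .htaccess rule as a servlet filter might look like the sketch below (assuming the javax.servlet API; the user-agent fragments are placeholders standing in for whatever maintained list you find):

```java
import java.io.IOException;
import java.util.Locale;
import java.util.Set;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/**
 * Rejects requests whose User-Agent header contains a known-bad fragment.
 * The entries below are placeholders; fill the set from a maintained
 * bad-user-agent list, as the answer above describes.
 */
public class BadUserAgentFilter implements Filter {

    private static final Set<String> BAD_AGENT_FRAGMENTS = Set.of(
            "httrack", "webcopier", "libwww-perl");   // placeholder entries, lower-case

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        String agent = ((HttpServletRequest) req).getHeader("User-Agent");
        if (agent != null) {
            String lower = agent.toLowerCase(Locale.ROOT);
            for (String bad : BAD_AGENT_FRAGMENTS) {
                if (lower.contains(bad)) {
                    // known-bad client: refuse the request outright
                    ((HttpServletResponse) res).sendError(HttpServletResponse.SC_FORBIDDEN);
                    return;
                }
            }
        }
        chain.doFilter(req, res);
    }

    @Override
    public void init(FilterConfig config) { }

    @Override
    public void destroy() { }
}
```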
