Prevent AJAX flooding in JavaScript
My site has a JavaScript function that makes an AJAX request to add an item to the cart without reloading the page and shows a simple notification:
AddToCart()
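A stripped-down sketch of what that function does (the /cart/add URL and item id here are just placeholders, not my actual code):

function AddToCart() {
    // send the item to the server without reloading the page
    fetch('/cart/add', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ itemId: 123 })
    }).then(function (res) {
        if (res.ok) { alert('Item added to cart'); } // simple notification
    });
}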
However, using any JavaScript console, I found you can flood this request with a simple statement:
while (true) {AddToCart()}
This eventually locks up the server until the browser crashes. A more stable browsing environment could probably even lock the server indefinitely. So what would be the best way to protect against such an attempt?
Perhaps you should just define the function in a private namespace?
(function () {
    // AddToCart only exists inside this closure, not on window
    function AddToCart() { /* ...AJAX call... */ }
    // wire it to the button in here (e.g. via addEventListener) so the page can still use it
})();
That way you can't reference it through the console.
This of course is not bulletproof as anyone could just replicate the code or make HTTP requests to the URI.
You can't stop the HTTP requests, but you can stop the server from doing the heavy processing: implement CSRF tokens, so the request is only processed when it carries a token that matches the one your page generated (based on variables such as a timestamp), which can't be (easily?) reproduced.
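A rough sketch of that idea, assuming a Node.js/Express backend with session and body-parsing middleware (the route names and the express-session usage are illustrative, not from the question):

const crypto = require('crypto');

// when rendering the page, issue a token and remember it in the session
app.get('/product/:id', function (req, res) {
    const token = crypto.randomBytes(32).toString('hex');
    req.session.csrfToken = token;                 // e.g. via express-session
    res.render('product', { csrfToken: token });   // embedded in the page for AddToCart to send back
});

// refuse the heavy processing unless the submitted token matches
app.post('/cart/add', function (req, res) {
    if (!req.body.csrfToken || req.body.csrfToken !== req.session.csrfToken) {
        return res.status(403).send('Invalid token');
    }
    // ...add the item to the cart...
    res.sendStatus(200);
});

Note that a token by itself only stops requests made without your page; to blunt a flood coming from the page itself you would also want to make the token single-use or combine it with rate limiting.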
They could do much more damage using ab (ApacheBench) with a high concurrency value, or they could just sit there hitting F5. You need a lower-level solution: rate limiting, by IP perhaps, or a one-use hash, or any number of other approaches.
There are lots of ways that servers protect themselves from rogue clients. In this particular case, "rate limiting" is probably appropriate: the server picks a maximum number of operations per minute that it considers reasonable for a human, and when the rate of operations from one client exceeds that, it protects itself. How it chooses to protect itself depends. It might immediately fail each new request for a while to avoid spending server resources, it might log the client out, or it might fail silently or return an error.
Keep in mind that real protection against this type of thing has to be done at the server, because AJAX calls can be made by anyone, not just your own client code.
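As a sketch of that server-side rate limiting, assuming a Node.js/Express server (the per-minute limit, keying by IP, and the addToCartHandler name are arbitrary choices for illustration):

// naive in-memory limiter: at most 30 cart additions per minute per client
const hits = new Map();           // client key -> timestamps of recent requests
const WINDOW_MS = 60 * 1000;
const MAX_PER_WINDOW = 30;

function rateLimit(req, res, next) {
    const key = req.ip;           // or a session/user id
    const now = Date.now();
    const recent = (hits.get(key) || []).filter(t => now - t < WINDOW_MS);
    recent.push(now);
    hits.set(key, recent);
    if (recent.length > MAX_PER_WINDOW) {
        return res.status(429).send('Too many requests'); // fail fast and cheaply
    }
    next();
}

app.post('/cart/add', rateLimit, addToCartHandler);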
On the client, you could protect against rogue JavaScript being injected in a number of ways. Lower down in your code, you could also implement rate limiting (right before you make the actual AJAX call) and refuse to carry out more than X AJAX calls per minute, as in the sketch below. This doesn't fully protect your server, but it protects you from your own AddToCart() function being used in this way.
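A minimal sketch of that client-side limit (the threshold of 10 calls per minute is arbitrary):

// allow at most 10 AddToCart calls per minute from this page
var callTimes = [];

function AddToCart() {
    var now = Date.now();
    callTimes = callTimes.filter(function (t) { return now - t < 60000; });
    if (callTimes.length >= 10) {
        return; // silently refuse; the server still needs its own limit
    }
    callTimes.push(now);
    // ...make the actual AJAX call here...
}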
Or you could make it so there is no top-level global function, taking no parameters, that can be called this way. You could do this either by removing the relevant functionality from the global namespace (make it a method on one of your objects that requires a proper "this" pointer), or by making the function require some internal state that wouldn't always be known.
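For example (a sketch only; the Cart object and sessionKey field are made up to show the shape of the idea):

var Cart = {
    sessionKey: null,               // set by the page at load time, not obvious from the console

    init: function (key) {
        this.sessionKey = key;
    },

    addItem: function (itemId) {
        // needs a proper "this" and the internal state the page set up
        if (!this.sessionKey || !itemId) { return; }
        // ...make the AJAX call, sending this.sessionKey and itemId...
    }
};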
Personally, I don't really feel like a client needs to be protected from abuse that its owner might inflict on it when there's no legitimate purpose for what's being done other than to cause mayhem. If users want to do bad things that crash their own client, that's fine; they can bring down the client with Task Manager if they want. You do want to protect against spraying your server with bad stuff and against anything bad that might happen during legitimate, normal user operations, but if users want to take down their own client, I'm not going to lose any sleep over that.
A request is a request, AJAX or not. The same rules apply as for a regular DoS attack. There's nothing to stop people from calling your URL directly, even without AJAX.
Someone clever enough to figure out your code, open their browser's console, and type while (true) {AddToCart()} doesn't even need a browser (or your code): they could just run wget in an infinite loop, or, if the goal is really a DoS, use a script built for that purpose.
On the server side, you're dealing with how to mitigate a denial-of-service attack. There are many strategies; using Nginx as a reverse proxy is the first that popped into my mind.
One thing you could do is make the AddToCart function only issue the request if one is not already in progress; see the sketch below.
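A quick sketch of that guard (assuming a fetch-based request to a placeholder /cart/add URL):

var addToCartInFlight = false;

function AddToCart() {
    if (addToCartInFlight) { return; }       // ignore calls while a request is pending
    addToCartInFlight = true;
    fetch('/cart/add', { method: 'POST' })
        .then(function (res) { /* show the notification */ })
        .finally(function () { addToCartInFlight = false; });
}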
Another thing you can do is obfuscate the code (there are tools for this; search for "JavaScript obfuscation") so it's not obvious which method does what.
Those two methods will help, but they won't solve the problem entirely. The server really needs to detect when it's getting spammed with requests from one client and limit them via a rate limiter.