Scaling rate-limited API requests over time for large bundles of requests
I'm integrating the Klout API with a web application I have been working on.
Klout allows 10 requests per second and 10,000 requests per day.
Each user account can view N other user accounts with their Klout scores.
Would the best way to gather this information be to periodically request scores from the Klout API in a background process and store the results in the database?
If I do this, let's say we have 10 users who each want to view 30 other users' Klout scores.
In a daily background process, let's also say we go through each of the 10 users and look up each of their 30 followed users' Klout scores.
This would ping Klout's API at a maximum of 300 times throughout this process (however, you can include up to 5 users per request, so the number of requests could be reduced to 60).
In order to avoid the 10 requests per second cap, should I sleep the process for about 10 seconds between each request? Would that be the best way to avoid any API errors?
Pseudocode example:
$maxUserPerRequest = 5;

foreach ($users as $user) {
    // List of Klout users this user follows, whose scores we need to look up
    $follows = getKloutFollows($user);
    $batch = array(); // up to 5 users per API request

    foreach ($follows as $follow) {
        $batch[] = $follow;

        // Query 5 users at a time to cut the total number of requests
        if (count($batch) == $maxUserPerRequest) {
            $kloutAPI->getScore($batch); // fetch and store the results
            $batch = array();
            sleep(10); // avoid overloading the API
        }
    }

    // Send any leftover users that didn't fill a full batch
    if (!empty($batch)) {
        $kloutAPI->getScore($batch);
        sleep(10);
    }
}
That should be enough to avoid the rate-limit errors, but doing mass database inserts and storing the data for more than 5 days would still violate the Terms of Service.
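If you do cache scores locally, one way to stay within that retention limit is to purge anything older than 5 days each time the background process runs. A minimal sketch, assuming a hypothetical klout_scores table with a fetched_at timestamp column and a PDO connection in $db:

// Hypothetical cleanup query; the table and column names are assumptions
$db->prepare(
    "DELETE FROM klout_scores WHERE fetched_at < NOW() - INTERVAL 5 DAY"
)->execute();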
I would just keep a counter of each request: make 10 requests, sleep for 1000 ms, reset the counter, make the next 10 requests, sleep again, and so on.
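A minimal sketch of that counter approach, assuming the same hypothetical $kloutAPI->getScore() helper from the question and a pre-built $batches array of up to 5 user IDs per entry:

// Burst up to 10 requests, then pause for one second before the next burst
$requestCount = 0;
foreach ($batches as $batch) {
    $kloutAPI->getScore($batch); // one API request per batch of up to 5 users
    $requestCount++;

    if ($requestCount == 10) {
        usleep(1000000); // 1000 ms pause keeps us under 10 requests/second
        $requestCount = 0;
    }
}

At 60 requests total this adds only a few seconds of sleep per run, versus roughly 10 minutes with a 10-second sleep after every request.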