Looking for API optimization suggestions
In the past I asked a few questions about how to create my own API to give external sites access to the data I store in my database. This time I'm asking for suggestions on how to optimize API access for the external sites, to help keep the load on my server down. First, the way the API is set up: MySQL holds the data and PHP serves it. The API itself is called via a URL with an API key in it. For example:
http://mysite.com/get_listings.php?key=somekey
Then my script does whatever the requested action needs and returns the results in JSON format. Currently the API serves 10-15k unique calls a month, and I am adding another group of sites that will bring in another 20-30k unique calls a month on top of that. There is no caching going on; it's just a straight call to my API via the PHP script, and it queries the MySQL database on every call. The data it returns is basically stories and profile information, and the calls are things like latest stories, the profile of a story's author, and story details.
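To give a rough idea, the script is essentially doing something like this (the table and column names below are simplified placeholders, not my real schema):

```php
<?php
// get_listings.php -- essentially what happens now: every request hits MySQL.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'dbpass');

// Validate the API key passed in the URL.
$key = isset($_GET['key']) ? $_GET['key'] : '';
$stmt = $pdo->prepare('SELECT id FROM api_keys WHERE api_key = ?');
$stmt->execute(array($key));
if (!$stmt->fetch()) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

// Query the stories and return them as JSON.
$stories = $pdo->query('SELECT id, title, author_id, created_at
                        FROM stories
                        ORDER BY created_at DESC
                        LIMIT 20')->fetchAll(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode($stories);
```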
Besides the normal optimization of PHP code and tuning MySQL queries and indexes, can you suggest any other optimization techniques that will help keep the server load down while still serving the API to the external sites quickly? I understand the external sites themselves should be doing some caching on their end as well. But on my end, do you suggest caching the results from the db and then serving from the cache? If so, would memcache be a good choice? Or perhaps storing the JSON results in Redis? Or would a simple file caching system work? Anything other than caching? Obviously server hardware can make a difference, but that's out of the scope of this post. I will say, though, that I will be getting a whole new server dedicated to doing only this.
I do have root access to install additional software, if that helps.
Unless you have the capacity to scale MySQL with additional read-only nodes, you should really start using Redis or Memcached. Redis is likely the better fit for handling complex data types. Here is a recipe:
- https://github.com/nicolasff/phpredis - install this; it lets you interface with Redis seamlessly from PHP.
- Classify your requests by certain keys. For example, get_listings.php?key=somekey would probably map to $redis->hGet('listings', $somekey);
- Store results serialised or in JSON.
- When the database is updated, be sure to destroy or update the relevant keys in Redis. If your back end uses MVC, then model behaviour handlers can do that for you. For example, when a new comment is posted, update the relevant Redis sorted set (see the write-path sketch after this list).
- You can mix and match different structures. For example, fetch an ID from a sorted set, then pull the actual data from a hash (both patterns are sketched below).
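Putting those points together, a cache-aside read path could look roughly like this; the key names, TTL, and schema are illustrative assumptions, not a prescription:

```php
<?php
// Cache-aside read: try Redis first, only hit MySQL on a miss.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'dbpass');

function get_listings($redis, $pdo, $somekey) {
    // 1. Check the 'listings' hash for a cached JSON blob.
    $cached = $redis->hGet('listings', $somekey);
    if ($cached !== false) {
        return $cached; // cache hit: no database work at all
    }

    // 2. Cache miss: query MySQL and serialise the rows as JSON.
    $stmt = $pdo->prepare('SELECT id, title, author_id FROM stories
                           WHERE listing_key = ?
                           ORDER BY created_at DESC LIMIT 20');
    $stmt->execute(array($somekey));
    $json = json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

    // 3. Store it for the next caller. Hash fields have no per-field TTL,
    //    so expire the whole hash to enforce a freshness window.
    $redis->hSet('listings', $somekey, $json);
    $redis->expire('listings', 300); // assumed 5-minute window

    return $json;
}

header('Content-Type: application/json');
echo get_listings($redis, $pdo, isset($_GET['key']) ? $_GET['key'] : '');
```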
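And on the write side, a hypothetical model hook could invalidate the cached JSON and keep a sorted set current at the same time (the function and key names are made up for the example):

```php
<?php
// Hypothetical hook, called after a new comment is committed to MySQL.
function on_comment_posted($redis, $storyId, $commentId, $timestamp) {
    // Drop the cached JSON for this story so the next read rebuilds it.
    $redis->hDel('story_details', $storyId);

    // Maintain a sorted set of comment IDs scored by post time; the read
    // path can then page through them with zRevRange and pull each
    // comment's data from a hash, without touching MySQL.
    $redis->zAdd('story:' . $storyId . ':comments', $timestamp, $commentId);
}
```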
If you are using Redis for permanent storage, make sure your PHP knows how to clean it up. In my experience Redis has turned out to be quite resilient, although I still keep the canonical data in MySQL.
Redis is great, provided you understand the computational complexity of its commands.