When a webpage is generated in real time, which memory does it use: server-side or client-side?
I have written PHP code that gets an id from the database, then uses that id with some APIs provided by other websites to generate a page.
My question is: where will this generated page occupy space, on the server or on the client machine?
If 10,000 people open the same page, will my server slow down?
Should I store all the data for those APIs in our MySQL database?
What will make it fast and safe?
Please advise.
Thanks
I have written PHP code that gets an id from the database, then uses that id with some APIs provided by other websites to generate a page. Where will this generated page occupy space, on the server or on the client machine?
The generated page will live on the client if you only fetch the one id from your server. For that you could first do a jquery.get to fetch the id from your server. Next, you could pull the data from the other APIs using JSONP (JSON with Padding). For this to work the APIs of course need to support JSONP: JavaScript clients can't fetch cross-domain data with a plain jquery.get because of the same-origin policy, but luckily JSONP can be used to get around it. Finally, you can simply append the data to the DOM using .html(). You should be careful doing this with other APIs and need to be sure they are safe APIs, because otherwise you would be vulnerable to XSS. If you are not certain, use .text() instead.
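A minimal sketch of that flow, assuming a hypothetical /get-id.php endpoint on your own server and a hypothetical JSONP-enabled API at https://api.example.com/item (substitute the real API):

// 1. Fetch the id from your own server (same origin, so a plain AJAX call works).
$.get('/get-id.php', function (id) {
    // 2. Fetch data from the third-party API via JSONP; jQuery appends a
    //    callback parameter and injects a <script> tag, which is how JSONP
    //    sidesteps the same-origin policy.
    $.ajax({
        url: 'https://api.example.com/item',   // hypothetical JSONP endpoint
        dataType: 'jsonp',
        data: { id: id },
        success: function (data) {
            // 3. Append to the DOM. .text() escapes the value, so an untrusted
            //    API cannot inject script (XSS); only use .html() for APIs you
            //    fully trust.
            $('#result').text(data.title);     // 'title' is an assumed field
        }
    });
});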
Should I store all the data for that API in our MySQL database?
It depends on whether the APIs provide JSONP.
What will make it fast and safe?
Fast
- Use APC to cache compiled bytecode. This will speed up your website tremendously without changing a single line of your code base.
- Use an in-memory store such as Redis or Memcached. You can also use APC itself to keep data in memory; see the sketch after this list. This speeds things up tremendously, because touching the disk (seeking the platter to the right sector, etc.) is very expensive, while memory access is very fast.
- The no-framework approach will keep your site fast; PHP is a dynamic language, so you should try to do as little work per request as possible.
- Tackle low-hanging fruit only. Remember that "premature optimization is the root of all evil". Rasmus Lerdorf teaches you how to do this in his video Simple is Hard from DrupalCon 2008; the slides are available in PHP's talks section.
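As a minimal sketch of the APC data-caching idea; fetch_from_api() and the 300-second TTL are assumptions for illustration:

<?php
// Cache an expensive remote API call in shared memory with APC.
function get_api_data($id)
{
    $key  = 'api_data_' . $id;
    $data = apc_fetch($key, $hit);      // $hit is set to true on a cache hit
    if (!$hit) {
        $data = fetch_from_api($id);    // hypothetical slow remote call
        apc_store($key, $data, 300);    // keep it in memory for 5 minutes
    }
    return $data;
}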
Safe
- Read up on the OWASP Top 10.
- Protect against XSS using the filter extension.
- Protect against SQL injection using PDO prepared statements; see the sketch after this list.
- Protect against CSRF.
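A minimal sketch of the PDO approach; the credentials and the pages table/columns are assumptions, so substitute your own schema:

<?php
// Use a prepared statement so user input never becomes part of the SQL
// string, which prevents SQL injection.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare('SELECT title FROM pages WHERE id = :id');
$stmt->execute(array(':id' => $_GET['id']));   // the value is bound, not concatenated
$row = $stmt->fetch(PDO::FETCH_ASSOC);

// Escape on output as well, to guard against XSS.
echo htmlspecialchars($row['title'], ENT_QUOTES, 'UTF-8');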
It all depends on your memory management. The memory is used by your server while the page is being rendered, but once the output is sent to the browser, PHP frees it and no longer cares. Now, if your scripts manage memory badly, Apache can certainly run out of it; it has built-in protections, but if you rely on those alone you are asking for dropped packets and page hangs.
If 10,000 people access your server at the same time, it is more likely the CPU that will be the bottleneck.
This is why tried-and-true PHP frameworks are ideal for large projects: most of them have taken all of this into account and ship with built-in optimizations.
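If you want to see how much memory one page render actually costs, a minimal sketch (render_page() is a hypothetical page-building function, and logging to the error log is just one option):

<?php
// Measure the real peak memory this request allocated; compare the number
// against memory_limit in php.ini when deciding how many Apache workers
// the machine can safely run.
echo render_page($_GET['id']);   // hypothetical: builds the page from the API
error_log('peak memory: ' . memory_get_peak_usage(true) . ' bytes');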
It really depends. The factors are:
- Time taken to generate a request's response
- Size of the request
- Number of concurrent connections
- Web server
- Speed of the API
and many more. Your server is not likely to slow down if 10,000 requests are spread over a period of time, but if 10,000 requests arrive every second there will be an impact, and its size depends on the factors above. The more concurrent connections the server holds open, the more memory they consume, and memory exhaustion can halt the server. So make sure that even when you get that many requests, they are served quickly and their connections and processes are not kept in memory for long; that saves memory and keeps your server from crashing.
However, if the API output is going to be the same for many users, it is wiser to keep that object in memory, since memory access is much faster than disk access; a sketch follows.
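For example, a minimal sketch using Memcached to share one API response across all visitors; the host, port, key name, and 300-second TTL are assumptions:

<?php
// Keep the shared API response in memory instead of refetching it per user.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$key      = 'api_response_' . $id;
$response = $mc->get($key);
if ($response === false) {           // cache miss: hit the remote API once
    $response = file_get_contents('https://api.example.com/item?id=' . urlencode($id));
    $mc->set($key, $response, 300);  // every visitor reuses this for 5 minutes
}
echo $response;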
If 10,000 people will be grabbing the same page that you're dynamically creating from another site's API, it sounds like you're pulling data from the other site and constructing a page with PHP on your server. So yes, that consumes a small amount of memory and processing resources on your system per hit. Memory use may be limited by the number of threads or forks your web server is allowed to use; processing power will not be limited artificially, only by what your server can handle.
But back to that number of 10,000 people grabbing the same page. If that's a possibility, you will want to generate the page locally and cache it somehow, so that it only has to be generated once. It doesn't make sense to generate the same output 10,000 times when you could generate it once and let it be fetched 10,000 times instead. Then it just becomes a matter of deciding when the cache is stale.
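A minimal sketch of that page-level cache, assuming a hypothetical render_page() helper and a 10-minute staleness window:

<?php
// Serve a cached copy of the generated page if it is fresh enough;
// otherwise build it once, store it, and serve it.
$cacheFile = '/tmp/page_' . md5($_GET['id']) . '.html';   // md5 keeps the filename safe

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < 600) {
    readfile($cacheFile);             // cache hit: no API call, no rendering
    exit;
}

$html = render_page($_GET['id']);     // hypothetical: builds the page from the API
file_put_contents($cacheFile, $html, LOCK_EX);
echo $html;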