
Use non-blocking streams to parallelize REST API requests in PHP?

Consider the following scenario:

  • http://www.restserver.com/example.php returns some content that I want to work with in my web-application.

  • I don't want to load it using ajax (SEO issues etc.)

  • My page takes 100ms to generate, the REST resource also takes 100ms to be loaded.

  • We assume that the 100 ms of page generation happen before I start working with the REST resource. Whatever comes after that can be neglected.

Example Code:

Index.php of my website

<?php
do_some_heavy_mysql_stuff();                 // takes 100 ms
get_rest_resource();                         // takes 100 ms
render_html_with_data_from_mysql_and_rest(); // takes a negligible amount of time
?>

Website will take ~200ms to generate.

I want to turn this into:

<?php
Restclient::initiate_rest_loading();         // takes 0 ms
do_some_heavy_mysql_stuff();                 // takes 100 ms
Restclient::get_rest_resource();             // takes 0 ms because 100 ms have already passed since initiation
render_html_with_data_from_mysql_and_rest(); // takes a negligible amount of time
?>

Website will take ~100ms to generate.

To accomplish this I thought about using something like this:

(I am pretty sure this code will not work, because this question is all about asking how to accomplish this and whether it's possible. I just thought some naive code could demonstrate it best.)

class Restclient {
    public static $handle;
    public static $buffer;

    public static function initiate_rest_loading() {
        // open the resource and keep the handle for the second call
        self::$handle = fopen("http://www.restserver.com/example.php", "r");
        // set to non-blocking so fgets will return immediately
        stream_set_blocking(self::$handle, false);
        // initiate loading, but return immediately to continue website generation
        fgets(self::$handle, 40960);
    }

    public static function get_rest_resource() {
        // set the stream to blocking again because now we really want the data
        stream_set_blocking(self::$handle, true);
        // get the data and save it so templates can work with it
        self::$buffer = fgets(self::$handle, 40960);
    }
}
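For context, one catch with the naive version: `fopen()` on an `http://` URL already blocks until the response headers arrive, so the overlap has to start at the connection itself. A hedged, untested sketch of that idea, using an asynchronous socket connect instead of the HTTP stream wrapper (host and path are the example ones from the question; a robust version would wait for writability with `stream_select()` before sending the request):

```php
<?php
class Restclient {
    public static $handle;
    public static $buffer;

    public static function initiate_rest_loading() {
        // connect asynchronously: this returns before the TCP handshake finishes
        self::$handle = stream_socket_client(
            "tcp://www.restserver.com:80", $errno, $errstr, 1,
            STREAM_CLIENT_ASYNC_CONNECT
        );
        stream_set_blocking(self::$handle, false);
        // send a raw HTTP request; in a real implementation, use
        // stream_select() first to make sure the socket is writable
        fwrite(self::$handle,
            "GET /example.php HTTP/1.0\r\nHost: www.restserver.com\r\n\r\n");
    }

    public static function get_rest_resource() {
        // now we actually need the data, so block until the response is complete
        stream_set_blocking(self::$handle, true);
        self::$buffer = stream_get_contents(self::$handle); // headers + body
        fclose(self::$handle);
    }
}
```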

So final question:

  • Is this possible and how?

  • What do I have to keep an eye on (internal buffer overflows, stream lengths, etc.)?

  • Are there better methods?

  • Does this work well with HTTP resources?

  • Any input is appreciated!

I hope I explained it understandably. If anything is unclear, please leave a comment so I can rephrase it!


As "any input is appreciated", here is mine:

  • What you want is called asynchronous processing (you want to do something while something else is being done "in the background").

To solve your problem, I thought of this:

  1. Separate do_some_heavy_mysql_stuff and get_rest_resource in two different PHP scripts.

  2. Use cURL "multi" ability to do simultaneous requests. Please, check:

    • curl_multi_init and related PHP functions
    • Simultaneous HTTP requests in PHP with cURL

This way, you can run both scripts at the same time. Using cURL's multi features, you can request http://example.com/do_some_heavy_mysql_stuff.php and http://example.com/get_rest_resource.php simultaneously, and then work with the results as soon as they are available.
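A minimal sketch of that curl_multi approach (the two URLs are the hypothetical ones from this answer, so it won't return real data as written):

```php
<?php
// Fire two HTTP requests in parallel with curl_multi and collect both bodies.
$urls = array(
    "http://example.com/do_some_heavy_mysql_stuff.php",
    "http://example.com/get_rest_resource.php",
);

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body instead of printing it
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// drive all transfers until none are still active
$active = null;
do {
    curl_multi_exec($mh, $active);
    curl_multi_select($mh); // sleep until there is activity on some handle
} while ($active > 0);

// collect the response bodies and clean up
$results = array();
foreach ($handles as $url => $ch) {
    $results[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```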

These are my first thoughts, and I'm sharing them with you. Maybe there are different and more interesting approaches... Good luck!
