
Causing two things to load in parallel?

I'm writing some PHP that does a fair amount of processing and then generates reports of the results. Previously it would do a periodic flush(), but we're moving to Zend Framework and can't do that anymore. Instead, I would like some kind of status display that updates while the report is generated. So I made a progress bar that loads in an iframe, added shared memory to the progress-bar update action and the report generation action, and caused the output to load via XMLHttpRequest. This all works fine. My issue is that the browser wants to do the two requests serially instead of in parallel, so it requests the progress bar and then BLOCKS until the progress bar completes BEFORE it requests the actual output. This means the process will never end, since the real work never starts.

I've searched all morning for some way around this and came up empty-handed. Is there some way to cause two connections, or am I just screwed?

My next action will be to break the processing apart some more and make the status updating action do the actual work, save the result, and then use the other action to dump it. This will be really painful and I'd like to avoid it.
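For context, the two actions share state roughly like this (a simplified sketch, not my exact code: the shmop key, the chunking, and processChunk() are illustrative; the JsPush option names are as I understand them from the ZF1 docs):

    // Report-generation action: does the real work and publishes progress
    // into shared memory. Error handling omitted; both actions must derive
    // the same shmop key.
    $shmId = shmop_open(ftok(__FILE__, 'r'), 'c', 0644, 16);
    foreach ($workChunks as $i => $chunk) {
        processChunk($chunk); // hypothetical worker function
        $percent = (int) round(($i + 1) / count($workChunks) * 100);
        shmop_write($shmId, str_pad((string) $percent, 16), 0);
    }

    // Progress action: polls shared memory and pushes updates into the
    // iframe via Zend_ProgressBar's JsPush adapter, which emits calls to
    // the Zend_ProgressBar_Update/Finish functions shown below.
    $adapter = new Zend_ProgressBar_Adapter_JsPush(array(
        'updateMethodName' => 'Zend_ProgressBar_Update',
        'finishMethodName' => 'Zend_ProgressBar_Finish',
    ));
    $progress = new Zend_ProgressBar($adapter, 0, 100);
    do {
        sleep(1);
        $percent = (int) shmop_read($shmId, 0, 16);
        $progress->update($percent, $percent . '% complete');
    } while ($percent < 100);
    $progress->finish();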

Edit: Here is the JavaScript, as requested:

    function startProgress()
    {
        var iFrame = document.createElement('iframe');
        document.getElementsByTagName('body')[0].appendChild(iFrame);
        iFrame.id = 'progressframe';            
        iFrame.src = '/report/progress';
    }

    function Zend_ProgressBar_Update(data)
    {
        document.getElementById('pg-percent').style.width = data.percent + '%';
        document.getElementById('pg-text-1').innerHTML = data.text;
        document.getElementById('pg-text-2').innerHTML = data.text;
    }

    function Zend_ProgressBar_Finish()
    {
        document.getElementById('pg-percent').style.width = '100%';
        document.getElementById('pg-text-1').innerHTML = 'Report Completed';
        document.getElementById('pg-text-2').innerHTML = 'Report Completed';
        document.getElementById('progressbar').style.display = 'none'; // Hide it
    }

    function ajaxTimeout(){
        xmlhttp.abort();
        alert('Request timed out');
    }

    var xmlhttp;
    var xmlhttpTimeout;

    function loadResults(){
        if (window.XMLHttpRequest){
            // code for IE7+, Firefox, Chrome, Opera, Safari
            xmlhttp = new XMLHttpRequest();
        } else {
            // code for IE6, IE5
            xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
        }
        xmlhttp.open("POST", "/report/output", true);
        xmlhttp.onreadystatechange = function(){
            if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
                clearTimeout(xmlhttpTimeout);
                document.getElementById('report-output').innerHTML = xmlhttp.responseText;
            }
        };
        // No "var" here: assign to the global declared above so that
        // ajaxTimeout() and the handler both see the same timer.
        xmlhttpTimeout = setTimeout(ajaxTimeout, 600000); // ten minutes
        xmlhttp.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
        // This script is echoed from inside a PHP string, so the original
        // POST body is interpolated into the send() call here:
        xmlhttp.send('".file_get_contents("php://input")."');
    }

This gets called from the following onload script:

onload="startProgress(); setTimeout(loadResults,1000);"

The issue is not in the JavaScript. If you put an alert() in there, the alert is triggered at the right time, but the browser delays the second HTTP transaction until the first completes.


Thank you everyone for your input.

I didn't come up with a satisfactory answer for this within the timeframe permitted by our development schedule. It appears that every common browser wants to re-use an existing connection to a site when doing multiple transactions with it; nothing I tried would make the browser open a second, parallel connection on demand. Any time there were two requests to the same server, the client insisted on handling them serially.

I ended up breaking the processing into parts and moving it into the status-bar update action, saving the report output into a temporary file on the server, and then having the status bar's finish function initiate the xmlhttprequest that loads the results. The output action simply spits out the contents of the temporary file and then deletes it.
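The output action ended up being almost trivial. In plain PHP (ignoring the ZF action plumbing, and with the temp-file naming as an assumption), it amounts to:

    session_start();
    $tmpFile = sys_get_temp_dir() . '/report-' . session_id() . '.html';
    if (is_readable($tmpFile)) {
        readfile($tmpFile); // stream the finished report to the browser
        unlink($tmpFile);   // one-shot: remove it once served
    } else {
        header('HTTP/1.1 404 Not Found');
    }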


Using two asynchronous ajax requests could do the trick. With the first request you start the process by calling the PHP CLI to do the actual work deep in the background (so it doesn't time out or get cancelled) and return the id of the process (task). Once you have the process id, you can start a periodic ajax poll to display the progress made.

Making a db table containing process_id, state, and user would not be a bad thing. That way, even if the user closed the browser while the process was running, the process would continue until done; the user could revisit the page and see the percentage done, because the process running in the CLI would save its progress into the db table.
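A sketch of the kick-off endpoint under those assumptions (the table and column names, worker.php, and the credentials are made up for illustration):

    $db = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
    $userId = 1; // stand-in for the logged-in user

    // Register the task so progress survives the browser closing.
    $stmt = $db->prepare(
        "INSERT INTO tasks (state, percent, user_id) VALUES ('running', 0, ?)"
    );
    $stmt->execute(array($userId));
    $taskId = (int) $db->lastInsertId();

    // Launch the worker via the CLI, fully detached from this request;
    // worker.php UPDATEs its row's percent/state as it goes.
    exec(sprintf('nohup php worker.php %d > /dev/null 2>&1 &', $taskId));

    echo json_encode(array('task' => $taskId));

The periodic ajax then just SELECTs state and percent for that task id.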


Make a system call to the php file and detach it?

ex:

    exec('nohup php test.php > test.out 2> test.err < /dev/null &');
    echo 'I am totally printing here';

test.php contains a two-second sleep before it prints, but the echo above returns immediately.

Have it store the results in a file/database/whatever. It will act like a very dirty fork.

You could probably also do something similar with a cURL call if you run into trouble with exec() (see the sketch below).

Credit here for the code example from bmellink (mine was way worse than his).
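For the cURL variant, a fire-and-forget sketch (the URL and timeout are assumptions; the target script must call ignore_user_abort(true) so it keeps running after the connection drops):

    $ch = curl_init('http://localhost/test.php'); // hypothetical URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_NOSIGNAL, true);   // allow sub-second timeouts
    curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100);  // bail out almost immediately
    curl_exec($ch);                             // times out; worker keeps going
    curl_close($ch);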


If you are able to load the report in the iFrame, you can kind of reverse your logic (I have done this to track file uploads to PHP).

  1. Load Report in iFrame (can be hidden or whatever you like).
  2. Make ajax call to get progress (step 1 will have to log progress as others have mentioned).
  3. When the progress endpoint reports completion, show the iframe or do whatever else is needed to finish.

Hope that helps. Just did a whole lot with iframes, CORS, and Ajax calls to APIs.
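For step 2, the polling endpoint can be very small. A sketch (paths are assumptions), assuming the report action in the iframe writes its percent-complete to a per-session file:

    session_start();
    $progressFile = sys_get_temp_dir() . '/progress-' . session_id();

    // Release the session lock right away: if the report request in the
    // iframe holds the session open, this poll would otherwise block
    // behind it, which is exactly the serialization being fought here.
    session_write_close();

    $percent = is_readable($progressFile) ? (int) file_get_contents($progressFile) : 0;
    header('Content-Type: application/json');
    echo json_encode(array('percent' => $percent, 'done' => $percent >= 100));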
