Synchronous request for a bunch of JavaScript files. sleep() in JavaScript
I want my JavaScript packets to be able to request (via synchronous Ajax calls) other JavaScript packets:
// 2 sequential synchronous calls
framework.require('packet1'); // time1 ms
framework.require('packet2'); // time2 ms
// use packets
These two requests take time1 + time2 milliseconds in total. Hence another idea: issue each request asynchronously, but guarantee that the whole batch of requests behaves synchronously:
// 2 parallel asynchronous calls, synchronous in total
framework.require([
'packet1', // time1 ms
'packet2' // time2 ms
]);
// use packets
As far as I understand, this should be faster. Now let's see my implementation:
framework = {
    require_counter: 0,
    require: function(arr)
    {
        framework.require_counter = arr.length;
        var success = function(data, textStatus, jqXHR)
        {
            framework.require_counter -= 1;
        }
        var error = function(jqXHR, textStatus, errorThrown)
        {
            framework.require_counter -= 1;
            // some error notification
        }
        // asynchronous calls
        for (var i = 0; i < arr.length; i++)
            $.ajax({
                url: arr[i],
                success: success,
                error: error
            });
        // wait
        while (framework.require_counter > 0)
            framework.wait();
        // finally return
    }
}
The trickiest part is implementing that wait() function. JavaScript does not provide one, so some other solution must be found; the asynchronous setTimeout() is not the answer. So my question is: how can such a wait() function be implemented? Or is there another general solution?
I've tried this approach: http://narayanraman.blogspot.com/2005/12/javascript-sleep-or-wait.html. Perhaps you can point out a more laconic solution without a server dependency.
If you really need synchronous requests, you can usually direct the underlying Ajax mechanism to perform them. For example, in jQuery you set the "async" option to false; with a raw XMLHttpRequest, you pass false as the third argument to open() (which is how Raman implements sleep in the blog post you link to). However, synchronous XHR is usually a Bad Thing, as it blocks JS execution in current browsers. Better is to use continuation-passing style, passing a function that represents the rest of the computation to run:
framework = {
    require: function(urls, done)
    {
        function success(data, textStatus, jqXHR)
        {
            //...
        }
        function error(jqXHR, textStatus, errorThrown)
        {
            //...
        }
        urls.unshift('');
        function load() {
            urls.shift();
            if (urls.length) {
                $.ajax({
                    url: urls[0],
                    // here are some continuations
                    complete: load,
                    success: success,
                    error: error
                });
            } else {
                // here's a continuation invocation
                done();
            }
        }
        load(); // kick off the chain
    }
};
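To make the continuation flow concrete, here is a self-contained sketch of the same chaining idea. The loadScript helper is a hypothetical stand-in for $.ajax that invokes its completion callback immediately; a real request would complete asynchronously, but the control flow is identical:

```javascript
// Hypothetical stand-in for $.ajax: records the URL and invokes the
// completion callback right away (a real request would be asynchronous).
var loaded = [];
function loadScript(url, onComplete) {
    loaded.push(url); // pretend the packet was fetched and evaluated
    onComplete();
}

// Sequential CPS loader: each completion continues with the next URL.
function requirePackets(urls, done) {
    var i = 0;
    function load() {
        if (i < urls.length) {
            loadScript(urls[i++], load); // continuation: load the next packet
        } else {
            done();                      // continuation invocation: all done
        }
    }
    load();
}

requirePackets(['packet1.js', 'packet2.js'], function () {
    console.log('loaded: ' + loaded.join(', ')); // → loaded: packet1.js, packet2.js
});
```

Note that the packets load one after another here; the counting version below is what lets them load in parallel.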
If instead of synchronicity you need to perform some action once all the given resources are loaded, you could use a continuation for the action but not for resource loading. Instead, you'd apply an event-handling model, recording the completion of each load and invoking the dependent action when all resource requests have finished.
framework = {
    require: function(urls, done)
    {
        var remaining = urls.length;
        var failed = [];
        // The following implementation is vulnerable to race conditions.
        // Good thing JS isn't generally multithreaded. If (when) it is,
        // and if the browser doesn't offer some form of synchronization,
        // then Peterson's algorithm should work correctly if inefficiently.
        function success(data, textStatus, jqXHR) {
            if (!--remaining) { // critical section
                done(failed);
            }
        }
        function makeErrorHandler(url) {
            return function (jqXHR, textStatus, errorThrown) {
                failed.push(url);
                if (!--remaining) { // critical section
                    done(failed);
                }
            }
        }
        for (var i = 0; i < urls.length; ++i) {
            $.ajax({
                url: urls[i],
                success: success,
                error: makeErrorHandler(urls[i])
            });
        }
    }
};
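As a runnable illustration of the counting pattern, here is a jQuery-free sketch in which a hypothetical fakeAjax helper completes immediately, "failing" any URL that contains the substring 'bad'. The per-URL closure plays the same role as makeErrorHandler above:

```javascript
// Hypothetical replacement for $.ajax that completes immediately:
// it fails any URL containing 'bad' and succeeds otherwise.
function fakeAjax(opts) {
    if (opts.url.indexOf('bad') !== -1) opts.error();
    else opts.success();
}

function requirePackets(urls, done) {
    var remaining = urls.length;
    var failed = [];
    function finishOne() {
        if (!--remaining) done(failed); // last completion fires the callback
    }
    for (var i = 0; i < urls.length; ++i) {
        (function (url) { // capture url per iteration, like makeErrorHandler
            fakeAjax({
                url: url,
                success: finishOne,
                error: function () { failed.push(url); finishOne(); }
            });
        })(urls[i]);
    }
}

var failedUrls;
requirePackets(['a.js', 'bad.js', 'c.js'], function (failed) {
    failedUrls = failed; // invoked once, after all three requests complete
});
// failedUrls is ['bad.js']
```

Because completion order doesn't matter to the counter, the requests are free to run in parallel, which is exactly the time1 + time2 saving the question asks for.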