
Javascript Database Mass Insert

I am trying to insert over 70,000 rows into a JavaScript (Web SQL) database, using Chrome 5.0.317.2. The inserts are taking a very long time. The page itself loads in a few seconds, and I can watch the percentage counter creep up very slowly as each row is inserted; it took about an hour to insert all the records. Is there a way to optimize the inserts, or to somehow start out with a preloaded SQLite database?

<script src="jquery.1.3.2.min.js" type="text/javascript" charset="utf-8"></script>
<script type="text/javascript" charset="utf-8">
// Truncated to 1 row for example. There are really 76547 rows.
var zipcodes = [{"city_name":"AMHERST","city_alias":"AMHERST","zipcode":"01002"}];
var db;
function openMyDatabase() {
    var shortName = 'mydb';
    var version = '1.0';
    var displayName = 'mydb';
    var maxSize = 65536;
    db = openDatabase(shortName, version, displayName, maxSize);
    db.transaction(
        function(transaction) {
            transaction.executeSql(
                'CREATE TABLE IF NOT EXISTS zipcode ' +
                '  (id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, ' +
                '   city_name TEXT NOT NULL, ' +
                '   city_alias TEXT NOT NULL, ' +
                '   zipcode TEXT NOT NULL)'
            );
        }
    );
    $.each(zipcodes, function(i, zipcode) {
        insertZipcode(zipcode.city_name, zipcode.city_alias, zipcode.zipcode, i);
    });
}

function errorHandler(transaction, error) {
    alert('Oops. Error was '+error.message+' (Code '+error.code+')');
    return true;
}

function insertZipcode(cityName, cityAlias, zipcode, i) {
    db.transaction(
        function(transaction) {
            transaction.executeSql(
                'INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);',
                [cityName, cityAlias, zipcode],
                function(){
                    $('#counter').html((100 * i / zipcodes.length) + '%');
                },
                errorHandler
            );
        }
    );
    return false;
}

$(function() {
    openMyDatabase();
});
</script>

Solution: On the PHP side, I made an associative array and used the zip code as the key and an array of cities as the value, and I ran it through json_encode and passed that to the javascript. On the javascript side I was able to very quickly get a list of cities for a particular zip code by using the following code:

var zipcodes = {"55437":["MINNEAPOLIS","BLOOMINGTON"]}; //truncated
alert('Cities in 55437: ' + zipcodes['55437'].join(', '));


One problem I can see is that you are inserting one row at a time, which incurs a lot of per-statement overhead (opening connections, etc.).

It would be faster if you could insert multiple rows (maybe 20 or 50) in one go, for example with a single multi-row INSERT INTO statement or some other efficient batching.
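A minimal sketch of that batching idea: build one parameterized multi-row INSERT for a chunk of records. The helper name `buildBatchInsert` and the record shape are assumptions taken from the question's data; note that SQLite only accepts multi-row VALUES lists from version 3.7.11 onward, so older engines may still need one statement per row.

```javascript
// Build a single multi-row INSERT statement plus a flattened parameter
// list for a batch of zipcode records (helper name is illustrative).
function buildBatchInsert(records) {
  // One "(?, ?, ?)" placeholder group per record.
  var placeholders = records.map(function () {
    return '(?, ?, ?)';
  }).join(', ');
  var sql = 'INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES ' +
            placeholders;
  // Flatten the records into one parameter array matching the placeholders.
  var params = [];
  records.forEach(function (r) {
    params.push(r.city_name, r.city_alias, r.zipcode);
  });
  return { sql: sql, params: params };
}
```

The resulting `sql` and `params` can then be handed to a single `transaction.executeSql(sql, params, ...)` call instead of one call per row.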


If you can't move it to something server-side (JavaScript is really not the tool for a job like this), then definitely bundle multiple inserts together, as Suraj suggests. 90% of the work is starting the connection, starting the transaction, ending the transaction, and closing the connection; only 10% is the actual DB operations.

transaction.executeSql('
            INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);
            INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);
            INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);
            ... //20-50 lines like this, maybe generated by a loop.
            ',[
            cityName1, cityAlias1, zipcode1,
            cityName2, cityAlias2, zipcode2,
            cityName3, cityAlias3, zipcode3,
            ... // a matching parameter list, generated by a loop as well.
            ],
            ...
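Another way to apply the same bundling idea, keeping one `executeSql` call per row but sharing the transaction overhead across a whole chunk, is sketched below. The `chunk` helper, the chunk size of 50, and the `insertInChunks` name are illustrative assumptions; `db` and the record shape come from the question.

```javascript
// Split an array into fixed-size chunks.
function chunk(items, size) {
  var chunks = [];
  for (var i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Issue all inserts for each chunk inside a single transaction, so the
// per-transaction cost is paid once per 50 rows instead of once per row.
function insertInChunks(db, records) {
  chunk(records, 50).forEach(function (batch) {
    db.transaction(function (transaction) {
      batch.forEach(function (r) {
        transaction.executeSql(
          'INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?)',
          [r.city_name, r.city_alias, r.zipcode]
        );
      });
    });
  });
}
```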


Why not use a preloaded XML file instead of creating all the records when the webpage loads? That way you reduce the loading time, and the search time could be reduced by some kind of indexing, maybe a hashtable or binary search.

This would reduce flexibility, meaning you would have to change the XML and rebuild it with the help of a tool (I don't know if such a tool exists), but it would allow for better performance, especially if you are working on a limited device like an iPhone.


What I did to overcome this problem was to first build a string containing one transaction with all of its executeSql calls, and then run it using JavaScript's eval method:

jsonResponse = Ext.util.JSON.decode(result.responseText);
jsonIndex = 0;
var consulta = "DB.transaction(function (transaction){";
while (jsonResponse[jsonIndex] != null) {
    var ins = jsonResponse[jsonIndex].instruccion;
    ins = ins.replace(/&quot;/gi, "\"");
    consulta += "transaction.executeSql('" + ins + "'); ";
    jsonIndex++;
}
consulta += "});";
eval(consulta);


I had the exact same problem. I found a blog post that offers a possible solution. Here's the link: http://blog.heldes.com/html5/sqlite-class-for-html5-database/

Good Luck


An hour is probably too long in any case, but even if you reduce that by a lot you will still have a significant wait. It will probably pay to move this work off your UI thread (for example into a Web Worker, or into small batches scheduled with setTimeout) to preserve responsiveness for the user.
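One sketch of the "keep the UI responsive" idea without a separate thread: process the array in small batches and yield to the event loop between them, so the page can repaint and handle input. The function name, batch size, and the `insertBatch` callback (assumed to perform the actual inserts, e.g. one transaction per batch) are illustrative assumptions.

```javascript
// Process items in batches, yielding to the event loop between batches.
// insertBatch(batch) does the real work; done() fires when all batches finish.
function processInBatches(items, batchSize, insertBatch, done) {
  var index = 0;
  function step() {
    var batch = items.slice(index, index + batchSize);
    insertBatch(batch);
    index += batchSize;
    if (index < items.length) {
      setTimeout(step, 0); // yield so the UI stays responsive
    } else if (done) {
      done();
    }
  }
  step();
}
```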
