Which is better for parsing large data?
I have an array of data in PHP: a lot of data in a particular format, shaped like a tree, i.e. the array is nested to n levels. I need to turn this data into editable HTML, binding each value to an inline-editing field that is saved via AJAX.
Think of it as one very large form.
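For concreteness, the inline-editing part I have in mind is roughly the sketch below (the /save.php endpoint and the data-path attribute are placeholders I made up, not an existing API):

    // Hypothetical sketch: post a single field back to the server when it changes.
    // The "/save.php" URL and the data-path attribute are assumptions for illustration.
    function attachAjaxSave(input) {
      input.addEventListener('change', function () {
        fetch('/save.php', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({
            path: input.dataset.path,   // identifies which node in the tree this field belongs to
            value: input.type === 'checkbox' ? input.checked : input.value
          })
        });
      });
    }

Each editable value would get one of these listeners, so every change is saved individually.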
When fetching the data from the server, which method is better?
1. Encode the data as JSON to keep the transfer size small, and parse it on the client with a DFS (depth-first search) function in JavaScript (a rough sketch of this is shown below).
2. Do the DFS on the server side in PHP and send the fully formed DOM/markup to the client.
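For the first method, the client-side DFS I have in mind is roughly the following. The node shape { label, value, children } is just how I happen to structure the JSON, and jsonFromServer and the #bigForm container are placeholders:

    // Walk the JSON tree depth-first and build a label + input pair per node.
    // Leaf values become editable fields with the AJAX save from the sketch above.
    function render(node, parentEl, path) {
      const row = document.createElement('div');
      const label = document.createElement('label');
      label.textContent = node.label;
      row.appendChild(label);
      parentEl.appendChild(row);

      if (node.children && node.children.length) {
        // internal node: recurse into the children
        node.children.forEach(function (child, i) {
          render(child, row, path + '/' + i);
        });
      } else {
        // leaf node: short string or boolean becomes an editable field
        const input = document.createElement('input');
        input.type = typeof node.value === 'boolean' ? 'checkbox' : 'text';
        if (input.type === 'checkbox') { input.checked = node.value; } else { input.value = node.value; }
        input.dataset.path = path;
        attachAjaxSave(input);
        row.appendChild(input);
      }
    }

    // Naive usage: every appendChild above touches the live document.
    render(JSON.parse(jsonFromServer), document.getElementById('bigForm'), '');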
With the first method, the payload is small enough to transfer even over a slow connection, but building the form touches the DOM a huge number of times and causes a lot of reflow.
With the second method, the payload is more than twice as large as in the first (all the tags and attributes are included), but the DOM on the client side is not disturbed at all.
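One thing I considered to soften the reflow cost of the first method is to build everything into a detached DocumentFragment and attach it to the document only once at the end, though I am not sure how much it helps in practice:

    // Same render() as in the sketch above, but building off-document,
    // so the live DOM is modified by a single appendChild at the end.
    const fragment = document.createDocumentFragment();
    render(JSON.parse(jsonFromServer), fragment, '');
    document.getElementById('bigForm').appendChild(fragment);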
The JSON would consist of many short values, which is what makes it large: no paragraphs, just short strings and booleans arranged in a tree structure.
Which should I sacrifice: DOM performance or transfer size? Is there a better way to transfer the data, or am I wrong somewhere?
In theory there should not be much difference in size between the JSON and the formed DOM once gzip compression is enabled.