
Visualizing large quantities of data on Google Maps / visualizations

I have a JSON file that's roughly 480 MB of geolocation points. I was wondering if someone knows of a good 'pattern' to use when trying to visualise the data. The problem I'm encountering is that the data has to be loaded into Google Maps from the get-go. This is causing all kinds of obvious issues.

I don't have to do this through Google; it just seemed like the obvious choice.


With that much data, it may make more sense to handle it on the server side instead of the client side. You could set up Geoserver with your data points. Using OpenLayers, you could overlay the points from Geoserver on top of Google Maps, or even on top of your own base map if you want to cut out Google Maps altogether. The heavy-duty processing then happens on the server, and only images are displayed in the browser, which cuts down on network traffic and the amount of processing the browser has to do. If you set up Geoserver to do caching, the server won't even have to work very hard.
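To make the setup concrete, here is a minimal sketch using the current OpenLayers package (ol) in TypeScript. The GeoServer URL and the workspace:points layer name are placeholders for whatever you publish, and an OSM base layer stands in for the Google layer:

```ts
import Map from 'ol/Map';
import View from 'ol/View';
import TileLayer from 'ol/layer/Tile';
import OSM from 'ol/source/OSM';
import TileWMS from 'ol/source/TileWMS';

// Base map: an OSM layer stands in for Google Maps in this sketch.
const base = new TileLayer({ source: new OSM() });

// The points arrive as rendered WMS tiles from GeoServer, so the browser
// never downloads the raw geometry. URL and layer name are placeholders.
const points = new TileLayer({
  source: new TileWMS({
    url: 'https://example.com/geoserver/wms', // hypothetical endpoint
    params: { LAYERS: 'workspace:points', TILED: true },
    serverType: 'geoserver',
  }),
});

new Map({
  target: 'map', // id of the page's map <div>
  layers: [base, points],
  view: new View({ center: [0, 0], zoom: 3 }),
});
```

The key point is that the browser only ever requests rendered tiles; the 480 MB of geometry stays on the server.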


It really depends on what kind of data this is.

If these are points for polylines or polygons, you might try encoding them (http://code.google.com/apis/maps/documentation/utilities/polylinealgorithm.html and http://code.google.com/apis/maps/documentation/utilities/polylineutility.html). There are also utility functions you can use to encode the points. This will significantly reduce the size of your data.
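For illustration, here is a minimal TypeScript sketch of the encoded polyline algorithm those links describe, assuming the input is an array of [lat, lng] pairs (the function names encodeValue and encodePolyline are my own):

```ts
// A sketch of Google's encoded polyline format: deltas between consecutive
// points, scaled to 1e-5 degrees, packed into 5-bit chunks as ASCII.
function encodeValue(v: number): string {
  // Left-shift and invert negatives so the sign lives in the low bit.
  let value = v < 0 ? ~(v << 1) : v << 1;
  let out = '';
  while (value >= 0x20) {
    out += String.fromCharCode((0x20 | (value & 0x1f)) + 63); // continuation chunks
    value >>= 5;
  }
  return out + String.fromCharCode(value + 63); // final chunk
}

function encodePolyline(points: Array<[number, number]>): string {
  let out = '';
  let prevLat = 0;
  let prevLng = 0;
  for (const [lat, lng] of points) {
    const latE5 = Math.round(lat * 1e5);
    const lngE5 = Math.round(lng * 1e5);
    // Only the delta from the previous point is encoded, which is what
    // keeps the output small for dense tracks.
    out += encodeValue(latE5 - prevLat) + encodeValue(lngE5 - prevLng);
    prevLat = latE5;
    prevLng = lngE5;
  }
  return out;
}

// The worked example from Google's documentation:
// encodePolyline([[38.5, -120.2], [40.7, -120.95], [43.252, -126.453]])
// === '_p~iF~ps|U_ulLnnqC_mqNvxq`@'
```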

You might also want to consider loading data based on the map's zoom level; a sketch follows below. (I am not sure what you mean by "the data has to be loaded from the get-go": you can load data into the map in response to events such as zooming and panning.)
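For example, a hedged sketch against the Google Maps JavaScript API: listen for the map's idle event and fetch only the points in the current viewport, at zoom levels where individual markers are worth drawing. The /points endpoint and its query parameters are assumptions about a server-side filter you would have to build:

```ts
// A minimal sketch, assuming the Maps JavaScript API (and its typings) are
// already loaded and `map` is an existing google.maps.Map instance.
declare const map: google.maps.Map;

let markers: google.maps.Marker[] = [];

map.addListener('idle', async () => {
  const zoom = map.getZoom();
  const bounds = map.getBounds();
  if (zoom === undefined || !bounds || zoom < 10) return; // zoomed out: draw nothing (or clusters)

  const sw = bounds.getSouthWest();
  const ne = bounds.getNorthEast();
  // Hypothetical server endpoint that returns only the points inside the bbox.
  const url = `/points?swLat=${sw.lat()}&swLng=${sw.lng()}` +
              `&neLat=${ne.lat()}&neLng=${ne.lng()}&zoom=${zoom}`;
  const points: google.maps.LatLngLiteral[] = await (await fetch(url)).json();

  // Drop the previous batch before drawing the new one.
  markers.forEach((m) => m.setMap(null));
  markers = points.map((p) => new google.maps.Marker({ position: p, map }));
});
```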

The Fusion Tables suggested in another answer will only accept 100 MB of data.

I can be more specific if you explain the nature of your data and what you are trying to do in more detail. Hope this helps.


Try Google Fusion Tables
