More than 640 000 elements in the array - memory problem [Dijkstra]
I have a script which builds an 803*803 (644,809-cell) graph with a value of 1,000,000 in each cell. With roughly 500*500 everything works fine, but now it crashes: it tries to allocate more than 64MB of memory (which I don't have). What's the solution? Somehow "split" it, or...?
$result = mysql_query("SELECT * FROM some_table", $connection);
confirm($result);
while ($rows = mysql_fetch_array($result)) {
    $result2 = mysql_query("SELECT * FROM some_table", $connection);
    confirm($result2);
    while ($rows2 = mysql_fetch_array($result2)) {
        $first = $rows["something"];
        $second = $rows2["something2"];
        $graph[$first][$second] = 1000000;
    }
}
* it's for the Dijkstra algorithm
p.s. no, I can't allocate more than 64MB
Try freeing your inner SQL result at the end of each loop iteration with mysql_free_result($result2); — PHP may not do it for you, depending on the PHP version (the garbage collector may be disabled, or ineffective in older versions).
Don't instantiate the two temporary variables inside the loop; use the mysql_fetch_array() results directly, such as $graph[$rows["something"]][$rows2["something2"]] = 1000000;
— that saves two memory allocations per iteration.
PS: This is micro-optimization, but it may save enough memory to fit into your 64M limit. Don't forget that with 64 * 1024 * 1024 bytes of memory, you have an average of about 104 bytes maximum for each of your 644,809 elements, plus the array overhead itself, plus whatever temporary data your algorithm allocates.
If it doesn't fit, consider splitting your matrix and running batched jobs, so the work is spread across several less memory-hungry script runs.
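As a sketch of the batching idea (the batch size and row count here are arbitrary assumptions, not from the question): each script run would process only one slice of the rows, e.g. via SQL LIMIT/OFFSET computed like this.

```php
<?php
// Hypothetical batch planner: split $totalRows rows into slices of at most
// $batchSize, yielding the LIMIT/OFFSET pair each script run would use.
function batchRanges($totalRows, $batchSize) {
    $ranges = array();
    for ($offset = 0; $offset < $totalRows; $offset += $batchSize) {
        $ranges[] = array(
            'offset' => $offset,
            'limit'  => min($batchSize, $totalRows - $offset),
        );
    }
    return $ranges;
}

// 803 rows split into batches of 200 -> 5 slices (200, 200, 200, 200, 3).
$ranges = batchRanges(803, 200);
```

Each run then appends its partial result to persistent storage (a file or a table), so no single run has to hold the whole 803*803 structure in memory.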
If your code sample above matches your real code, you're fetching the same result set twice (the second time even inside a loop). If it's the same dataset, fetching it once from the database is enough and will reduce database load, execution time, and memory footprint all at once.
Perhaps the following approach may work in your memory-restricted environment.
$result = mysql_unbuffered_query("SELECT * FROM some_table", $connection);
confirm($result);
$rawData = array();
while ($rows = mysql_fetch_assoc($result)) {
    $rawData[] = array($rows["something"], $rows["something2"]);
}
mysql_free_result($result);

$graph = array();
foreach ($rawData as $r1) {
    foreach ($rawData as $r2) {
        $graph[$r1[0]][$r2[1]] = 1000000;
    }
}
unset($rawData);
Notes:
- I'm using mysql_fetch_assoc() instead of mysql_fetch_array() because the latter returns every column twice (once numerically indexed and once indexed by column name).
- Using mysql_unbuffered_query() instead of mysql_query() might also reduce the memory footprint (depending on the actual dataset size).
Try using an adjacency list (http://en.wikipedia.org/wiki/Adjacency_list) to represent the graph instead of an adjacency matrix (I think you are using a matrix because of $graph[$first][$second] = 1000000;).
For a sparse graph, it takes far less memory.
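A minimal sketch of the adjacency-list idea in PHP (the node names and weights are made up for illustration): only the edges that actually exist are stored, instead of all 803*803 cells.

```php
<?php
// Adjacency list: each node maps only to its actual neighbours and the
// edge weight, instead of reserving a full matrix row per node.
function addEdge(&$adj, $from, $to, $weight) {
    $adj[$from][$to] = $weight;
}

$adj = array();
addEdge($adj, 'A', 'B', 5);
addEdge($adj, 'A', 'C', 2);
addEdge($adj, 'B', 'C', 1);

// A sparse graph stores just these 3 edges here, not a 3x3 = 9-cell matrix;
// nodes with no outgoing edges (like 'C') consume no storage at all.
```

Dijkstra works naturally on this structure: relaxing a node means iterating over `$adj[$node]` rather than scanning a whole matrix row.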
If you insist on using PHP for memory-heavy operations (which is not really a good idea to begin with), I would partition the graph into quadrants and use GD to combine the quadrants. That way you only have to build the graph with 1/4 of the memory footprint.
Again, this is not ideal, but you're trying to use a nail to drive in a hammer :D