PHP script building a JSON result fails when pulling large amounts of data
I have a query that returns about 4000 objects, which need to be pulled out and turned into a JSON result. When I run it with around 400 rows, it works fine. But when I try to pull more data, I get a blank page - I assume PHP just stops executing. I only waited 10 - 20 seconds, so the timeout setting is probably not the issue. Could it be memory? I changed memory_limit to 512M and tried again; still no change.
max_execution_time = 30 ; Maximum execution time of each script, in seconds
max_input_time = 60 ; Maximum amount of time each script may spend parsing request data
;max_input_nesting_level = 64 ; Maximum input variable nesting level
memory_limit = 512M ; Maximum amount of memory a script may consume (128MB)
Then I wondered if it's MySQL related, but I really don't know what to change...
key_buffer_size = 16K
max_allowed_packet = 16M
table_open_cache = 4
sort_buffer_size = 64K
read_buffer_size = 256K
read_rnd_buffer_size = 256K
net_buffer_length = 2K
thread_stack = 128K
I am on a testing box, so CPU and RAM shouldn't be the issue.
Any idea what might be the problem?
Set the error reporting level to the maximum:
error_reporting(E_ALL);
ini_set('display_errors',1);
This will help you understand the problem.
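If nothing shows up on screen, it can also help to raise the limits for that one request and log the script's peak memory use, so you can tell whether memory_limit or max_execution_time is actually being hit. A minimal sketch (the limit values here are only examples, not recommendations):

ini_set('memory_limit', '512M');   // raise the memory limit for this request only
set_time_limit(120);               // allow up to 120 seconds of execution time

// ... run the query and build the JSON here ...

// Log how much memory the script actually needed.
error_log('Peak memory: ' . memory_get_peak_usage(true) . ' bytes');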
This doesn't answer the question as asked, but you could keep memory usage low by printing the outer JSON array manually:
echo "[";
$first = true;
while ($row = mysql_fetch_assoc($result)) {
if ($first)
$first = false;
else
echo ",";
echo json_encode($row);
}
echo "]\n";