
Zend Framework: Problem with a big database

I am working on a project involving a personalised search engine. I came upon Zend Framework while searching for a way to optimise the time it takes to return search results.

I am dealing with a database of 5,000,000 rows and 3 fields, and I am trying to index it with Zend_Search_Lucene.

Here's what I do: after opening an index with Zend_Search_Lucene::open(), I query the database (the query itself is correct), put the results into $result1, then fetch the rows and handle them with the following loop:

    while (($row1 = mysql_fetch_array($result1, MYSQL_NUM))) {
        $doc = new Zend_Search_Lucene_Document();
        $doc->addField(Zend_Search_Lucene_Field::UnIndexed('catid', $row1[0]));
        $doc->addField(Zend_Search_Lucene_Field::Text('topic', $row1[1]));
        $doc->addField(Zend_Search_Lucene_Field::Text('title', $row1[2]));

        $index->addDocument($doc);
    }
    $index->optimize();
    $index->commit();

The problem is that addDocument() never gets through more than a few thousand rows: every time, the program stalls at $index->addDocument($doc), somewhere between 3,000 and 5,000 documents, and never reaches the point of committing the index. Any thoughts?
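One common way to keep a long Zend_Search_Lucene indexing run from bogging down is to commit in batches instead of buffering everything until one final commit. Below is a minimal sketch of that approach, assuming the Zend Framework 1 Zend_Search_Lucene API; the index path, batch sizes, and memory limit are illustrative placeholders, not values from the question.

```php
<?php
// Sketch: batch commits during a large indexing run, so buffered documents
// are flushed to disk periodically instead of accumulating in memory.
// Assumes Zend Framework 1's Zend_Search_Lucene; '/path/to/index' and the
// batch sizes below are hypothetical and should be tuned to your setup.
set_time_limit(0);                    // no execution time limit
ini_set('memory_limit', '512M');      // headroom for indexing millions of rows

$index = Zend_Search_Lucene::open('/path/to/index');
$index->setMaxBufferedDocs(1000);     // flush the in-memory buffer every 1000 docs

$count = 0;
while (($row1 = mysql_fetch_array($result1, MYSQL_NUM))) {
    $doc = new Zend_Search_Lucene_Document();
    $doc->addField(Zend_Search_Lucene_Field::UnIndexed('catid', $row1[0]));
    $doc->addField(Zend_Search_Lucene_Field::Text('topic', $row1[1]));
    $doc->addField(Zend_Search_Lucene_Field::Text('title', $row1[2]));
    $index->addDocument($doc);

    if (++$count % 10000 === 0) {
        $index->commit();             // persist progress in chunks
    }
}
$index->commit();
$index->optimize();                   // optimize once, at the very end
```

Note that optimize() is moved after the final commit and called only once: optimizing merges all index segments and is itself expensive on a 5,000,000-row index.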


Moving comments to an answer:

@Faidon Passias Are you running this from the command line?

If so, max_execution_time defaults to 0 (no limit).

Otherwise, you may want to raise the limit with set_time_limit() (php.net/manual/en/function.set-time-limit.php) or the max_execution_time ini setting (php.net/manual/en/info.configuration.php#ini.max-execution-time).

My guess is that the script is being stopped after a certain amount of time.
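The suggestion above can be sketched as follows: lift the execution time limit before entering the long-running indexing loop. Either call works; set_time_limit() is just a wrapper around the same setting.

```php
<?php
// Lift PHP's execution time limit before a long-running job.
// 0 means "no limit" (this is already the default under the CLI SAPI).
set_time_limit(0);

// Equivalent via ini_set:
ini_set('max_execution_time', '0');

echo ini_get('max_execution_time');   // prints "0"
```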

