
Read Database into XML and Search It

I have a database table with book details and would like users to search it. However, I was told I could read all the data into XML on page load and run the search against the XML instead of making repeated round trips to the database. My question is: is this possible?


My table structure is like this:

===============================================================
|  id    | name     | pages  |  category  | date | publisher  |
===============================================================

Has anyone done this sort of thing before?

Any suggestion would be appreciated.


This is only an example (fake, untested code): you must test and adapt it before using it.

First you need to build an XML file from your table:

$dom = new DOMDocument('1.0', 'utf-8');

// Recursively turn an array of rows into XML elements.
function arr2xml($arr, $el = null)
{
    global $dom;
    foreach ($arr as $key => $val) {
        // Numeric keys (row indexes) become <row> elements; column names become their own tags.
        $tag_el = $dom->createElement(is_numeric($key) ? 'row' : $key);
        if (is_array($val)) {
            arr2xml($val, $tag_el);
        } else {
            $tag_el->appendChild($dom->createCDATASection($val));
        }
        empty($el) ? $dom->appendChild($tag_el) : $el->appendChild($tag_el);
    }
}

$result = mysql_query('SELECT * FROM your_table', $mysql_link);
$arr = array();
if ($result !== FALSE && mysql_num_rows($result))
{
    while ($row = mysql_fetch_assoc($result))   // assoc keys so column names become element names
        $arr[] = $row;
}
arr2xml($arr);
file_put_contents('tablexml.xml', $dom->saveXML());

///// FOR SEARCH BY category

$dom = new DOMDocument();
$dom->loadXML(file_get_contents('tablexml.xml'));
$xpath = new DOMXPath($dom);
// contains() matches rows whose category includes the search string.
$res = $xpath->query("//row[contains(category, '" . $search_value . "')]");
if ($res->length)
{
    foreach ($res as $el)
    {
        ////////////////
    }
}
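
Inside that loop, each matched <row> element can be read back field by field. A minimal, untested sketch (the element names simply mirror the table columns above):

foreach ($res as $el)
{
    // Each child element carries one column value wrapped in CDATA.
    $name     = $el->getElementsByTagName('name')->item(0)->nodeValue;
    $category = $el->getElementsByTagName('category')->item(0)->nodeValue;
    echo $name . ' (' . $category . ')' . "\n";
}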


XPath is what you'd use to do the actual searching. It is a query language for XML.

Assuming your database dump in XML resembles the following:

<database>
    <book>
       <id>000000</id>
       <name>A Book</name>
       <pages>23</pages>
       <category>Tech</category>
       <date>2011-08-31</date>
       <publisher>Publisher</publisher>
    </book>
    ...
</database>

An XPath query to search for books with more than 20 pages would be:

//book[pages > 20]
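
To run such a query from PHP you could use DOMXPath, as in this rough, untested sketch (the file name books.xml is only an assumption for the XML shown above):

$dom = new DOMDocument();
$dom->load('books.xml');                        // the <database> document shown above
$xpath = new DOMXPath($dom);
$matches = $xpath->query('//book[pages > 20]'); // books with more than 20 pages
foreach ($matches as $book) {
    echo $book->getElementsByTagName('name')->item(0)->nodeValue, "\n";
}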


XSLT may suit your needs if you are looking for a client-side solution.

Check the demo
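
As a rough illustration of the XSLT idea (shown here server-side with PHP's XSLTProcessor rather than in the browser; the stylesheet, file name and category value are my own assumptions), a stylesheet can copy out only the matching <book> elements:

$xslSource = '<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/database">
    <results>
      <!-- keep only books in the requested category -->
      <xsl:copy-of select="book[category = \'Tech\']"/>
    </results>
  </xsl:template>
</xsl:stylesheet>';

$xml = new DOMDocument();
$xml->load('books.xml');

$xsl = new DOMDocument();
$xsl->loadXML($xslSource);

$proc = new XSLTProcessor();
$proc->importStylesheet($xsl);
echo $proc->transformToXML($xml);   // XML containing only the matching books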


You seem to be suggesting dumping the whole database table into an XML file. Sure, that can be done, but it won't make your searches any faster. To speed up database queries you would normally add indexes on the fields used in those queries.
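
For example, a hedged sketch of adding an index on a searched column (the table and column names are assumptions based on the question's schema):

// Index the column that search queries filter on.
mysql_query('ALTER TABLE books ADD INDEX idx_category (category)', $mysql_link);
// Queries filtering on category can now use the index instead of scanning the whole table.
$result = mysql_query("SELECT * FROM books WHERE category = 'Tech'", $mysql_link);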

Another technique (as also suggested in the comment by DrColossos): if your searches are text-intensive, for example on the title (name?) field, you can use a full-text search engine such as Solr or Sphinx in conjunction with the database engine. That gives you per-word indexes over the text fields and thus allows faster and more advanced searches.
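
If a dedicated search engine is overkill, MySQL's built-in full-text index illustrates the same per-word indexing idea on a smaller scale. A rough sketch (assumes a MyISAM books table with a name column; the old mysql_* API is used only to match the code above):

mysql_query('ALTER TABLE books ADD FULLTEXT (name)', $mysql_link);
$term   = mysql_real_escape_string($search_value, $mysql_link);
$result = mysql_query("SELECT * FROM books WHERE MATCH(name) AGAINST('$term')", $mysql_link);
while ($row = mysql_fetch_assoc($result)) {
    echo $row['name'], "\n";   // books whose name matches the search term
}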

Regards.

