How can I parse consumed web-service results into a database table?
I'm using the National Weather Service's REST web service to return forecast data for locations. I can get the data back and display it onscreen by applying an XSLT stylesheet, and I can use XSLT to transform the returned XML into a file of SQL insert statements that I can then import manually. However, I'm a bit confused as to how I can do this automatically.
Some clarification: I need to have a cron job run on a scheduled basis to pull in data from the web service. I then need to somehow parse that data out into database records. This all needs to happen automatically, via the single PHP file that I'm allowed to call in the cron job.
Can anyone give me an idea of how I'd go about this? Do I need to save the XML response to an actual file on my server, transform that file into a SQL file, and then somehow (again, automatically) run an import on the SQL file? Ideally, I wouldn't have to save anything; I'd just do a direct insert via a database connection in my PHP file. Would that work if I looped through the response XML using a DOM parser rather than an XSLT file?
I'm open to any alternative; I've never done this before, have no idea of how to go about it, and have been unable to find any kind of articles or tutorials about parsing XML data into a database directly.
You need to parse the XML data instead of using XSLT to transform it.
You can use xml_parse_into_struct to turn it into a PHP array and work with it from there.
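A minimal sketch of that approach, using an invented `<temperature>` element rather than the actual NWS schema:

```php
<?php
// Parse a fetched XML string into PHP arrays with the built-in parser.
// In the cron job, $xml would be the body of the web-service response.
$xml = '<data><temperature>72</temperature><temperature>68</temperature></data>';

$parser = xml_parser_create();
xml_parse_into_struct($parser, $xml, $values, $index);
xml_parser_free($parser);

// $index maps (upper-cased) tag names to positions in $values;
// pull out each reading's text content.
$temps = array();
foreach ($index['TEMPERATURE'] as $i) {
    $temps[] = $values[$i]['value'];
}
print_r($temps);
```

From there you can loop over the array and run your inserts.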
It is probably easier, though, to use SimpleXML and walk the XML tree.
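This is also how you'd get the "no intermediate file" behavior you asked about: walk the response with SimpleXML and insert each record directly through a PDO prepared statement. A sketch, where the element names (`<forecast>`, `<time>`, `<temperature>`), table name, and credentials are all placeholders, not the real NWS schema:

```php
<?php
// In the cron job this would come straight from the service:
//   $response = file_get_contents($serviceUrl);
$response = '<dwml>
  <forecast><time>2023-01-01T12:00</time><temperature>72</temperature></forecast>
  <forecast><time>2023-01-01T18:00</time><temperature>68</temperature></forecast>
</dwml>';
$xml = simplexml_load_string($response);

// Walk the tree and collect one row per forecast element.
$rows = array();
foreach ($xml->forecast as $f) {
    $rows[] = array((string) $f->time, (string) $f->temperature);
}

// With a live database connection, you would execute a prepared
// statement per row instead of (or after) collecting them:
//   $db = new PDO('mysql:host=localhost;dbname=weather', 'user', 'pass');
//   $stmt = $db->prepare('INSERT INTO forecasts (valid_time, temperature) VALUES (?, ?)');
//   foreach ($rows as $r) { $stmt->execute($r); }
print_r($rows);
```

Nothing ever touches the filesystem: fetch, parse, insert, all inside the one PHP file your cron job calls.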
Since you already have an XSLT transformation, you can also write the SQL out to a file and pipe it into mysql directly:
exec("mysql -uusername -ppassword database < xml_sql.txt");
Good Luck!