How do I transfer wget output to a file or DB?

I'm trying to use a small script to download a field from multiple pages. For one thing, I'm only able to get it from one page so far..., but the real problem I'm having is that I don't know how to hand the output off to a database table. How can I take the output from curl/lynx | grep (which is going to be all the list items) and move it, list item by list item, to a table in my DB, or to a CSV where it will be ready for import into the DB?

#!/bin/bash

# Grab the page source, take the 8th double-quote-delimited field of each
# line, and keep only the results that contain a list item
lynx --source "http://www.thewebsite.com" | cut -d\" -f8 | grep "<li>"

The database I would connect to is MySQL; we can call the dummy table "listTable". Please try to stick to bash. I'm not allowed to compile anything on the server I'm using, and I can't seem to get curl to work with PHP. Anyway, I'm thinking I need to capture the output in a variable and then systematically pass the contents of that variable to the database, right?


Use something like awk, sed, or perl to turn each line into an INSERT statement, then pipe the result to your SQL client (psql or mysql).
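For example, a minimal sketch of that approach, reusing the pipeline from the question; the column name (item), MySQL user (myuser), and database name (mydatabase) are placeholders I made up, not details from your setup:

#!/bin/bash

# Scrape the list items, double any embedded single quotes so the SQL
# stays valid, wrap each line in an INSERT, and feed it all to mysql.
lynx --source "http://www.thewebsite.com" \
  | cut -d\" -f8 \
  | grep "<li>" \
  | sed "s/'/''/g" \
  | awk '{printf "INSERT INTO listTable (item) VALUES ('\''%s'\'');\n", $0}' \
  | mysql -u myuser -p mydatabase

The -p flag makes mysql prompt for the password on the terminal, so it still works even though stdin is taken up by the pipe.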


Just write a Python script which reads everything from stdin and puts it into the database, then do something like:

curl http://www.google.com | ./put_to_db.py
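A sketch of what put_to_db.py could look like, assuming a MySQL driver like MySQLdb is already installed on the server (so nothing has to be compiled), and borrowing the listTable name from the question; the item column and the connection details are placeholders:

#!/usr/bin/env python
# put_to_db.py: read lines from stdin and insert each non-empty one
# into listTable. Connection details below are placeholders.
import sys
import MySQLdb

conn = MySQLdb.connect(host="localhost", user="myuser",
                       passwd="mypassword", db="mydatabase")
cur = conn.cursor()

for line in sys.stdin:
    line = line.strip()
    if line:
        # Parameterized query: the driver handles quoting and escaping.
        cur.execute("INSERT INTO listTable (item) VALUES (%s)", (line,))

conn.commit()
cur.close()
conn.close()

Remember to chmod +x put_to_db.py so the pipe above can execute it.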
