
Getting an XML file from a website and copying it to a server

My question is quite broad since I do not know exactly what I ought to do. This is my first attempt, and I hope I can express my need adequately.

I want to read an XML file from a website and copy it to my Amazon cloud server account. I want to write the code on my Amazon server, on a Linux platform.

1- I need to check the XML file on the website every day.

2- If there is a change in the XML file, I will fetch it and copy it to my Amazon cloud server. (Perhaps I can compare the character lengths of today's and the previous day's XML files to tell whether there has been a change.)

3- I did some research and found that the wget command can be used to copy a file.

Could you please give me some sample code and guidelines?

Many thanks,

I apologize if my question is nonsense or ambiguous.


Yes, you can use the wget or curl command to download the XML file, and diff (or cmp) to compare the new file against the old one. Consider writing a bash shell script to automate these steps and scheduling it to run periodically with cron. You could run this directly on your "cloud server", rather than transferring the XML file there after doing the checks elsewhere.
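A minimal sketch of the steps above, using curl to download and cmp to detect changes. The URL and file paths are placeholders you would replace with your own; byte-by-byte comparison with cmp is used instead of comparing character lengths, since a change can leave the length unchanged.

```shell
#!/bin/sh
# update_feed URL DEST
# Downloads URL to a temp file and overwrites DEST only if the content
# changed. URL and paths passed in below are illustrative placeholders.
update_feed() {
    url="$1"; current="$2"; tmp="${current}.new"

    # -s: silent, -f: fail on server errors, -o: write to file
    curl -sf -o "$tmp" "$url" || return 1

    # cmp -s compares byte-by-byte and is silent; nonzero means "differs"
    # (also true when the old file does not exist yet)
    if ! cmp -s "$tmp" "$current"; then
        mv "$tmp" "$current"       # changed: keep the new copy
        echo "updated"
    else
        rm -f "$tmp"               # unchanged: discard the download
    fi
}

# Example usage (placeholder URL/path):
# update_feed "https://example.com/feed.xml" "$HOME/data/feed.xml"
```

To run it every day, add a crontab entry such as `0 6 * * * /path/to/update_feed.sh` (daily at 06:00) with `crontab -e`.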

