Can I take a snapshot of a website from the command line? [closed]
Closed 12 years ago.
I am writing documentation for a website database interface. Is it possible to write a script that, when executed, takes the most recent snapshot of the website and saves it as an image file?
Note: I am using the tags ruby, bash, and python since those are the languages used in our group (along with R and FORTRAN; for sport, I'll set a 100-point bounty if anyone can post a FORTRAN solution).
Update: as @birryree notes, a previous question provides the answer. Since the FORTRAN offer is just for sport, perhaps this is a duplicate?
wkhtmltopdf ships with a companion binary, wkhtmltoimage, which converts web pages to images using WebKit's renderer.
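For instance, a minimal sketch that shells out to wkhtmltoimage from Python (assuming the binary is on your PATH; the URL and output filename are placeholders):

```python
import subprocess

# wkhtmltoimage renders the page with WebKit and writes an image file
# whose format is inferred from the output extension (.png here).
cmd = ["wkhtmltoimage", "http://www.example.com/", "example.png"]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually capture
```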
You might find this useful -- an interesting Python project which processes a queue of URLs and exports the browser screenshots to your server.
Would it be okay to take a snapshot of a web page and save it as an HTML file with images? Because that's trivial:

wget -m -p http://www.example.com/example.html

(-m mirrors the site; -p also downloads the images, stylesheets, and scripts the page needs to display.)
This can be done with Selenium. First set up Selenium RC. Then, using any of the language APIs, call the captureScreenshot() method. Here is an example in Java:
http://www.cloudtesting.com/blog/2009/06/24/capturing-screen-shots-of-browsers-with-selenium-and-cloud-testing-part-1/
Note that Selenium will need a display to render to, but on Linux you can use a virtual framebuffer like Xvfb (probably installed by default).
I would also recommend the Selenium IDE Firefox plugin, as it can basically write the code for you in any of the supported languages.
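The same screenshot can also be taken with Selenium's WebDriver API in Python — a sketch, assuming the selenium package and Firefox are installed (run it under Xvfb on a headless machine):

```python
def capture(url, out_path):
    # Imported lazily so the sketch can be read without selenium installed.
    from selenium import webdriver

    driver = webdriver.Firefox()
    try:
        driver.get(url)
        driver.save_screenshot(out_path)  # writes a PNG file
    finally:
        driver.quit()

# capture("http://www.example.com/", "example.png")
```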
CutyCapt (command-line) does a fine job.
CutyCapt --url=http://www.example.org --out=example.png
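On a headless server CutyCapt also needs a display; a common approach is to wrap it in xvfb-run (from the Xvfb package). A sketch that builds the command from Python, assuming both programs are installed (the URL and output name are placeholders):

```python
import shutil

def build_cmd(url, out_path):
    # Wrap CutyCapt in xvfb-run when it is available, so the capture
    # runs inside a throwaway virtual X display.
    cmd = ["CutyCapt", "--url=" + url, "--out=" + out_path]
    if shutil.which("xvfb-run"):
        cmd = ["xvfb-run", "-a"] + cmd
    return cmd

print(build_cmd("http://www.example.org", "example.png"))
```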