How to extract images from a web page, preserving their display size? [closed]
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 3 years ago.
I need to collect images from a large CMS in order to group them into a single CSS sprite. But many of them are displayed at a reduced size within the CMS. Saving the complete web page always gives the images at their original dimensions, so CSS sprite generators render them at their original size. And the CSS sprite technique relies on HTML elements with background images, which can only be shown at their original size.
Does anybody know an application or script that can grab images from a web page but save them at their display dimensions? (This also means it should save multiple instances of the same image as separate files.)
I know this is an older question, but it came up in my searching.
Since web pages only hand the URLs and display dimensions up to the browser, it is the browser's job to render each image at the proper size. However, the browser also takes into account any zoom settings or other preferences you may have set, and some elements may have relative sizes that change with the viewport (desktop vs. mobile, for example). So an exact pixel-for-pixel download may give you false-positive results that are tailored to the downloading software or browser.
Your best bet (although not a prewritten piece of software like you asked for) might be to download a list of all image resources. Use a tool like the ones others have mentioned; I like this one: http://www.webalyzing.com/free-seo-tools/image-extraction.php
If you absolutely have to have the exact pixel dimensions given in the HTML, you may need to write a little DOM-parsing script that iterates through all of the img elements. Here is a simple (untested) example in PHP.
// Load the remote page and collect every <img> element
libxml_use_internal_errors(true); // ignore warnings from malformed HTML
$dom = new DOMDocument();
$dom->loadHTMLFile('http://some-url');
$images = $dom->getElementsByTagName('img');
foreach ($images as $image) {
    $src    = $image->getAttribute('src');
    $width  = $image->getAttribute('width');  // may be empty if the size is set via CSS
    $height = $image->getAttribute('height');
    // Get other attributes as needed: styles, class, etc...
}
Etc...
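If you then want to actually save each file at its display size, a minimal (untested) sketch using PHP's GD extension could follow on from that loop. The function name saveAtDisplaySize, the PNG output format, and the assumption that width/height are plain pixel attributes are all just illustrative:

// Hypothetical helper: download $src and write a copy scaled to the
// width/height attributes found in the HTML (assumes the GD extension is installed).
function saveAtDisplaySize(string $src, int $width, int $height, string $outFile): void
{
    $data    = file_get_contents($src);           // fetch the original image
    $img     = imagecreatefromstring($data);      // decode it with GD
    $resized = imagescale($img, $width, $height); // scale to the display size
    imagepng($resized, $outFile);                 // save the resized copy
    imagedestroy($img);
    imagedestroy($resized);
}
// e.g. inside the loop: saveAtDisplaySize($src, (int)$width, (int)$height, 'sprites/img1.png');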
I think you can only do it manually. There is a Firefox extension called "Pixlr Grabber" that can help you.
https://addons.mozilla.org/en-US/firefox/addon/pixlr-grabber/
This small tool also extracts images from a web page: http://www.prowebguru.com/2011/12/free-online-image-extractor/
You can also try this: http://www.wikihow.com/Download-All-Images-on-a-Web-Page-at-Once