
Using PHP cURL (libcurl), how can I download only the images from a URL?

I want to use this to measure how long it takes to download images; then I can do the same measurement with other content types.
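A minimal sketch of the timing side, using PHP's cURL extension (the image URL below is just a placeholder):

<?php
# Time the download of a single image with cURL.
$image_url = "http://www.example.com/logo.png";   # placeholder image address

$ch = curl_init($image_url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   # return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   # follow redirects
$data = curl_exec($ch);

if($data === false)
{
    echo "Download failed: ".curl_error($ch)."\n";
}
else
{
    # libcurl reports the total transfer time (seconds) and the bytes received
    $seconds = curl_getinfo($ch, CURLINFO_TOTAL_TIME);
    $bytes   = curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);
    printf("Downloaded %d bytes in %.3f seconds\n", $bytes, $seconds);
}
curl_close($ch);
?>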

I found a relevant example and am sharing it:

# Parse the image tags
$img_tag_array = parse_array($web_page['FILE'], "<img", ">");
if(count($img_tag_array)==0)
{
    echo "No images found at $target\n";
    exit;
}

The reason for DIY is to automate; this is part of a bigger framework.


Why don't you make it simple: download Google Chrome and use the Web Developer tools to see detailed statistics on the time it takes to download the images? It seems you are overcomplicating this, unless I am misunderstanding what exactly it is you want to do.


Found the function that does what I wanted, so I am sharing it:


function download_images_for_page($target)
{
    echo "target = $target\n";

    # Download the web page
    $web_page = http_get($target, $referer="");

    # Update the target in case there was a redirection
    $target = $web_page['STATUS']['url'];

    # Strip file name off target for use as page base
    $page_base=get_base_page_address($target);

    # Identify the directory where images are to be saved
    $save_image_directory = "saved_images_".str_replace("http://", "", $page_base);

    # Parse the image tags
    $img_tag_array = parse_array($web_page['FILE'], "<img", ">");

    if(count($img_tag_array)==0)
    {
        echo "No images found at $target\n";
        exit;
    }

    # Echo the image source attribute from each image tag
    for($xx=0; $xx<count($img_tag_array); $xx++)
    {
        # get_attribute() comes from the same parsing library as parse_array()
        echo get_attribute($img_tag_array[$xx], "src")."\n";
    }
}
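Note that the helpers used above (http_get, parse_array, get_attribute, get_base_page_address) are not built-in PHP functions; http_get and the parse helpers look like the LIB_http / LIB_parse libraries distributed with the Webbots, Spiders, and Screen Scrapers example code, so they have to be included or defined first. A hedged usage sketch, assuming those library filenames and a placeholder target URL:

# Usage sketch (assumption: the helper libraries live in LIB_http.php / LIB_parse.php;
# get_base_page_address() must also be defined somewhere in the framework)
include("LIB_http.php");    # assumed to provide http_get()
include("LIB_parse.php");   # assumed to provide parse_array() and get_attribute()

$start = microtime(true);
download_images_for_page("http://www.example.com/");
printf("Image pass took %.3f seconds\n", microtime(true) - $start);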
