
Bash command to copy images from a remote URL

I'm using the Terminal on my Mac.

I want to copy images from the remote URL http://media.pragprog.com/titles/rails4/code/depot_b/public/images/ to a local directory.

What's the command to do that?

Thanks,


You can use curl for individual files, for example:

curl -O "http://media.pragprog.com/titles/rails4/code/depot_b/public/images/FILENAME.jpg"

Note that curl does not expand a * wildcard against an HTTP server, so you have to name each file explicitly (or use curl's built-in [] / {} URL globbing for files whose names follow a pattern).
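If you know (or can list) the file names, a small shell loop is the simplest way to grab them all with curl. A minimal sketch, assuming hypothetical file names a.jpg, b.jpg, and c.jpg; substitute the real names from the directory listing:

for f in a.jpg b.jpg c.jpg; do   # hypothetical names; replace with the actual files
  curl -O "http://media.pragprog.com/titles/rails4/code/depot_b/public/images/$f"
done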


Alternatively, you may want just the images from an entire website. wget can do this with its recursive options, for example:

$ wget -r -A jpeg,jpg,bmp,png,gif,tiff,xpm,ico http://www.website.com/

This recursively downloads only files matching the comma-delimited list of extensions, starting at the site index. It works like a web spider, so if an image isn't referenced anywhere on the site it will be missed.
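A slightly more constrained sketch for the directory from the question, assuming you want the images dropped straight into the current folder: -np stops wget from climbing into parent directories, and -nd stops it from recreating the remote directory tree locally:

wget -r -np -nd -A jpg,jpeg,png,gif http://media.pragprog.com/titles/rails4/code/depot_b/public/images/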


wget will also work, assuming the server has directory listing enabled:

wget -m http://media.pragprog.com/titles/rails4/code/depot_b/public/images
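As a quick sanity check of that directory-listing assumption, you can fetch the directory URL yourself and see whether an index page with links comes back:

curl -s http://media.pragprog.com/titles/rails4/code/depot_b/public/images/ | grep -i href

If nothing is printed, the server probably isn't exposing an index, and -m / -r will have little to crawl from that URL.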


You can do this with Wget or cURL. cURL ships with OS X, but Wget does not, so you may need to install it with MacPorts or something similar.
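A minimal install sketch, assuming MacPorts is already set up (another package manager such as Homebrew works the same way):

sudo port install wget    # via MacPorts
# or, if you use Homebrew instead:
brew install wget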
