
Get list of files via http server using cli (zsh/bash)

Greetings to everyone,

I'm on OS X. I use the terminal a lot, a habit from my old Linux days that I never got past. I wanted to download the files listed on this http server: http://files.ubuntu-gr.org/ubuntistas/pdfs/

I selected them all with the mouse, put them in a txt file, and then gave the following command in the terminal:

for i in `cat ../newfile`; do wget http://files.ubuntu-gr.org/ubuntistas/pdfs/$i;done

I guess it's pretty self-explanatory.
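
Since you already have the filenames in a file, one option is to let wget read its URLs from standard input with -i - instead of looping in the shell. A minimal sketch, assuming ../newfile contains one bare filename per line:

# prepend the base URL to each filename, then feed the full URLs to wget on stdin
base=http://files.ubuntu-gr.org/ubuntistas/pdfs/
sed "s|^|$base|" ../newfile | wget -i -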

I was wondering if there's any easier, better, cooler way to download these "linked" pdf files using wget or curl.

Regards


You can do this with one line of wget as follows:

wget -r -nd -A pdf -I /ubuntistas/pdfs/ http://files.ubuntu-gr.org/ubuntistas/pdfs/

Here's what each parameter means:

  • -r makes wget recursively follow links
  • -nd avoids creating directories so all files are stored in the current directory
  • -A restricts the files saved by type
  • -I restricts by directory (this one is important if you don't want to download the whole internet ;)
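
If you'd rather not spell out the include path with -I, wget's --no-parent option does much the same job by refusing to climb above the starting directory during recursion. A sketch of the equivalent command under that assumption:

# -np (--no-parent) keeps the recursion inside /ubuntistas/pdfs/ without listing it via -I
wget -r -nd -np -A pdf http://files.ubuntu-gr.org/ubuntistas/pdfs/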