
Issue with Unix Bash Script that Reads In MP3 URLs from a .txt File, then Downloads and Renames Files

I posted earlier needing help with a script to read a list of ".mp3" URLs from a text file ("URLs.txt"), download each file, rename them in numerical order (1, 2, 3, ...), and save them to a "URLs" folder on the Desktop:

URLs.txt

http://...34566.mp3

http://...234.mp3

http://...126567.mp3

...becomes...

URLs Desktop folder

1.mp3

2.mp3

3.mp3

Shortly after, I kindly received the following response in Unix bash (for use in Automator):

#!/bin/bash
mkdir -p ~/Desktop/URLs
n=1
while read mp3; do
  curl "$mp3" > ~/Desktop/URLs/$n.mp3
  ((n++))
done < ~/Desktop/URLs.txt

However, although the script runs fine, it only downloads files up to somewhere in the range "47.mp3" - "49.mp3". The script doesn't stop; it just doesn't download anything beyond that point...

I'm very new to Unix bash, and excuse my ignorance, but is it possible that there's a "50 limit" on the script or the webpage?

I'm not sure how many URLs my text file has, but it's well over 49.
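If it's useful, I believe the URLs could be counted with something like the following one-liner (assuming one URL per line, as in the example above):

wc -l < ~/Desktop/URLs.txt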

I've looked through the text file to make sure that all the URL paths are correct, and all seems fine...

I've also downloaded 47 - 52 manually to make sure that they can actually be downloaded, which they are.


No, there is no inherent shell script limit that you are hitting.

Is it possible that the web server you are downloading the MP3s from has a rate limiter that kicks in at 50 downloads in too short a time? If so, you will need to slow down your script.

Try this modification and see what happens if you start at the 50th MP3:

#!/bin/bash
mkdir -p ~/Desktop/URLs
n=1
while read mp3; do
  # only download once the counter reaches 50; earlier URLs are skipped
  ((n >= 50)) && curl "$mp3" > ~/Desktop/URLs/$n.mp3
  ((n++))
done < ~/Desktop/URLs.txt

If you want to slow it down, add a sleep call to the loop, for example:
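A minimal sketch of what that could look like, built on the script above (the 2-second delay is just an assumed starting value to tune):

#!/bin/bash
mkdir -p ~/Desktop/URLs
n=1
while read mp3; do
  curl "$mp3" > ~/Desktop/URLs/$n.mp3
  ((n++))
  sleep 2   # assumed delay between downloads; adjust as needed
done < ~/Desktop/URLs.txt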
