Professionals can do it: Bash + escaping
I spent the whole night trying to get this working, but all my attempts ended in failure.
I wrote a very simple script to show what I'm trying to do; please copy it and try to run it.
#!/bin/bash
set -x
urls='http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'
#urls="http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3"
for letsgo in `curl -OLJg "'${urls}'"` ; do
echo "GOT TRIED OF TRYING"
done
# for letsgo in `curl -OLJg $urls` ; do
#echo "GOT TRIED OF TRYING"
# done
The result I got after running it:
First loop:
./ap2.sh
+ urls='http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'
++ curl -OLJg ''\''http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'\'''
curl: (1) Protocol 'http not supported or disabled in libcurl
+ for letsgo in '`curl -OLJg "'\''${urls}'\''"`'
+ echo 'GOT TRIED OF TRYING'
GOT TRIED OF TRYING
Second loop:
./ap2.sh
+ urls='http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'
++ curl -OLJg http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine '(Original' 'Mix).mp3'
curl: option -: is unknown
curl: try 'curl --help' or 'curl --manual' for more information
The problem is that something, somewhere, is escaping the URL without my intent and keeping things from working properly.
Update
I got rid of it by using:
for letsgo in `curl -OLJg "${urls}"` ; do
echo "Working Fine But We Still Have Problem When We Are Using More Than 1 URL"
done
The problem is that when the script has more than one URL, each of them must be quoted (at least in my case) for curl to work properly. I can do this manually in a Linux console without any problem, but when it comes to doing it from a bash script, the result is:
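What the traces show is a general rule: quote characters stored inside a variable are data, not syntax, and are never re-parsed when the variable is expanded. A minimal demo (the string is just a placeholder):

```shell
# Quotes inside a variable are literal characters; word splitting still
# happens on the spaces when the expansion is left unquoted.
arg="'hello world'"
printf '<%s>\n' $arg    # unquoted: split into <'hello> and <world'>
printf '<%s>\n' "$arg"  # quoted: one argument, quotes kept literally
```

This is exactly why the embedded single quotes end up inside the URL that curl sees.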
#!/bin/bash
set -x
urls="'http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3' -OLJg 'http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1' -OLJg "
for letsgo in `curl -OLJg "${urls}"` ; do
echo "Working Fine But We Still Have Problem When We Are Using More Than 1 URL"
done
Results:-
+ urls=''\''http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'\'' -OLJg '\''http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1'\'' '
++ curl -OLJg ''\''http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'\'' -OLJg '\''http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1'\'' '
curl: (1) Protocol 'http not supported or disabled in libcurl
+ for letsgo in '`curl -OLJg "${urls}"`'
+ echo 'Working Fine But We Still Have Problem When We Are Using More Than 1 URL'
Working Fine But We Still Have Problem When We Are Using More Than 1 URL
I just want it to work the same way as when I enter it on the Linux console, without bash interfering by escaping the strings. Like this:
curl -OLJg 'http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3' -OLJg 'http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1' -OLJ
You should remove the single quotes in the argument of curl; the way you have written it, they become part of the URL.
BASH FAQ entry #50: "I'm trying to put a command in a variable, but the complex cases always fail!"
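The FAQ's fix applies directly here: instead of packing quotes and options into one string, store each argument as its own array element. A sketch with placeholder URLs (not the real ones from the question):

```shell
# Each array element becomes exactly one argument to curl; the space in
# the first URL survives because "${args[@]}" is a quoted expansion.
args=(-OLJg 'http://example.com/An-Beat - Mentally Insine.mp3'
      -OLJg 'http://example.com/index.php?d01=1')
# curl "${args[@]}"           # the real call, skipped in this demo
printf '[%s]\n' "${args[@]}"  # show the argument boundaries instead
```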
Have you ever seen a blank in a browser address bar? Spaces must be converted to %20 (corrected by Lucas's comment, thanks), and possibly similar special characters, too.
wellurl=$(echo "$urls" | sed 's/ /%20/g')
I don't know curl - it's something similar to wget, isn't it?
wget -np $wellurl
2011-04-10 16:55:28 (17.2 MB/s) - 'An-Beat - Mentally Insine (Original Mix).mp3' saved [191]
worked for me.
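If spawning echo and sed per URL feels heavy, the same space-to-%20 replacement can be done with bash parameter expansion alone (assuming bash, not plain sh; the URL is a placeholder):

```shell
# Pure-bash equivalent of: echo "$urls" | sed 's/ /%20/g'
urls='http://example.com/An-Beat - Mentally Insine.mp3'
wellurl=${urls// /%20}
echo "$wellurl"   # prints http://example.com/An-Beat%20-%20Mentally%20Insine.mp3
```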
update:
To get multiple urls from a script, use an array:
#!/bin/bash
#
declare -a urls
urls=('http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3' 'http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1')
for i in $(seq 0 $(( ${#urls[@]} - 1 )) )
do
wellurl=$(echo "${urls[i]}" | sed 's/ /%20/g')
# echo "$wellurl"
curl -OLJg "$wellurl"
done
- ${#urls[@]} returns the number of elements in the array
- don't put options into the array
- use smaller urls in your next questions, please :)
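As an aside, the index loop can be dropped entirely: iterating over the quoted expansion "${urls[@]}" visits each element whole, spaces included. A sketch with placeholder URLs:

```shell
# One iteration per array element; no word splitting on the spaces.
urls=('http://example.com/a b.mp3' 'http://example.com/c?d=1')
count=0
for u in "${urls[@]}"; do
    count=$((count + 1))
    printf '<%s>\n' "$u"   # each URL arrives intact
done
```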
Try this:
urls="http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3"
curl -OLJg "${urls}" | while read results
do
...
done
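One caveat with this pipeline: curl's exit status is hidden behind the pipe, and in bash the while loop runs in a subshell, so variables set inside it are lost afterwards. PIPESTATUS recovers the left side's status after the loop (false stands in for a failing curl so the demo runs offline):

```shell
# 'false' simulates a failing curl; the read loop itself still exits 0.
false | while read -r results; do
    :   # process each line of curl's output here
done
rc=${PIPESTATUS[0]}   # exit status of the left side of the pipe
echo "left side exited with: $rc"
```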
The use of a variable called urls suggests that there will be more than one URL in there. If so, you might consider bash arrays. Also, the echo message "GOT TRIED OF TRYING" suggests that the curl might fail. If so, you might consider checking for the error more explicitly.
Check out and try running the following:
set -x
list_of_urls=('http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3')
for url in "${list_of_urls[@]}"
do
curl -s -OLJg "${url}"
rc=$?
if [ "$rc" -gt 0 ]; then
echo "$url is a PROBLEM! (return code: $rc)"
fi
done
One thing I noticed when running this: the server "succeeds" (in other words, $? is equal to 0), but curl returns a file called error.html. This could be another error condition that you trap for. Good luck!
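One way to trap the error.html case: curl's -f/--fail option makes HTTP errors (a 404 page, say) exit with code 22 instead of saving the server's error page. A runnable sketch, with false standing in for the curl call so it works offline:

```shell
# Stand-in for: curl -f -s -OLJg "$url"
# (-f turns an HTTP error into a nonzero exit instead of an error.html file)
false                 # pretend the download failed
rc=$?                 # capture at once; later commands overwrite $?
if [ "$rc" -gt 0 ]; then
    echo "PROBLEM! (return code: $rc)"
fi
```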
Doesn't curl -K help you? (You can put the URLs in a file exactly as you see them in the navigation bar.) http://curl.haxx.se/docs/manpage.html