complex bash functions: correct usage?
So, to come to the point: I have a bash script that's giving me grief. It looks like this:
#!/bin/bash

dosomething () {
    # code
    echo $1 >> ~/test.txt # For debugging
    $1 & # ← Important line
    # more code
}

main () {
    # other code
    dosomething "/usr/bin/rsync $rsync_options" # ← Call it "do_1"
    dosomething "/usr/bin/find $findme_path -iname \"*.gpx\" -print0 | xargs -0 $other_command" # ← Call it "do_2"
    # another code
}
( main | (zenity --progress $zenity_options || $still_another_command ) &
What should happen: the output of main is piped to zenity (a progress bar). main does not execute any command directly; instead it calls dosomething with a parameter that contains the command to be executed, and dosomething runs that command.
What really happens: the echo part of dosomething works as expected. After the script has run, the commands of do_1 and do_2 appear correctly in ~/test.txt. (If I copy and paste the contents of ~/test.txt into my terminal, every command executes with the expected results.)
The "Important line" of "do_1" is executed with the expected results. But the "Important line" of "do_2" seems to have no effect. At least, I can see no effects of $other_command after executing the script.I hope you can at least understand what I mean. It would be very kind of you if you gave me a hint what goes wrong in here.
Short answer: see BashFAQ #50.
Long answer: when bash parses a line, it handles quote marks and command delimiters (|, etc.) before doing variable substitution. As a result, when it runs $1 & in the function, the quote marks and pipe in $1's value never get parsed; they're just passed on to the command (/usr/bin/find in this case) as ordinary arguments. Net result: it's actually running the equivalent of /usr/bin/find $findme_path -iname '"*.gpx"' -print0 '|' xargs -0 $other_command.
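A minimal illustration of that parsing order (a throwaway example, not from the script above):

cmd='echo hello | tr a-z A-Z'
$cmd    # prints: hello | tr a-z A-Z -- the pipe and the tr words are just more arguments to echo

Nothing ever gets piped; echo simply prints the literal | and everything after it.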
Normally, in cases like this, I'd recommend passing the command as a series of words (i.e. have the function run "$@" &, and call it as dosomething /usr/bin/find $findme_path -iname "*.gpx" -print0), but even that won't handle the pipe in the command -- it'll still be treated as just another argument.
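A sketch of that series-of-words pattern (the function and call here are only illustrative, and it still cannot express a pipe):

dosomething () {
    "$@" &    # each argument remains its own word, so the caller's quoting is preserved
}

dosomething /usr/bin/find "$findme_path" -iname "*.gpx" -print0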
One possibility is to use eval. This should be avoided if at all possible, because eval is a good way to create scripting bugs -- massive bugs, subtle bugs, incomprehensible bugs, security bugs... It adds an additional layer of parsing to everything in the command line, which means that, for example, if you're trying to operate on a file named Fred's file.txt, it'll take that apostrophe as a quote mark and get very confused. Operating on files that happen to contain back-quotes is too horrible to contemplate. Basically, it's bad news.
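For instance (a hypothetical filename, just to show the failure mode):

file="Fred's file.txt"
eval "ls -l $file"    # eval re-parses the expanded text; the apostrophe opens a quote
                      # that is never closed, so bash aborts with an unexpected-EOF error

With back-quotes in the name, the re-parse would instead try to run part of the filename as a command substitution.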
After a quick look at the actual script, what I'd recommend is a mix of strategies: hide as much as possible of the command complexity in shell functions, so you aren't trying to pass pipes and redirects into the dosomething
function, then use the series-of-words approach I mentioned earlier. For the script in the question, I'd do:
#!/bin/bash

dosomething () {
    # code
    printf "%q " "$@" >> ~/test.txt # this gives a much better idea what's being done than echo $1 would
    "$@" &
    # more code
}

# hide the pipeline in a shell function
find_and_do_something () {
    /usr/bin/find "$1" -iname "*.gpx" -print0 | xargs -0 $other_command
}

main () {
    # other code
    dosomething /usr/bin/rsync $rsync_options
    dosomething find_and_do_something "$findme_path"
    # another code
}
( main | (zenity --progress $zenity_options || $still_another_command ) &
That means your log file won't have the details of the pipe that's being executed, just find_and_do_something /whatevers/in/findme_path, but at least it'll work.
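Incidentally, the printf "%q " logging quotes each word so that a logged line can be pasted straight back into a shell; for example (made-up arguments):

printf "%q " /usr/bin/rsync -av "a file with spaces"
# writes: /usr/bin/rsync -av a\ file\ with\ spaces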
Where do you define $other_command? Also, note that you're missing a closing parenthesis at the end: the ( before main opens a subshell that is never closed.
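Assuming the intent of that last line stays the same, a balanced version would look something like:

( main | ( zenity --progress $zenity_options || $still_another_command ) ) &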