Reading arguments for a program from a custom file and running the program with those arguments
Hi all!
I need to read arguments from a file 'data' that consists of strings like:
-a -camb="1 0.5 1",diff="1 0 0" -q=5
-a -camb="0 1 0" -p -q -f=10
...
Next, these arguments must be passed to a program ./test from within a script:
#!/bin/bash
while read line
do
./test "$line"
done < "./data"
The problem is that "$line" is passed as argv[1] to ./test, and not as a sequence of argv[1], argv[2], argv[3].
How can I split the string line into several arguments, i.e. so that ./test receives argv[1], argv[2], and so on?
Note that -camb="1 0.5 1",diff="1 0 0" must be passed as a single, whole argument, argv[2]!
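For checking how the arguments actually arrive, a hypothetical stand-in for ./test (just a sketch, not your real program) could print each element of argv on its own line:
#!/bin/bash
# Hypothetical stand-in for ./test: print every argument it receives,
# one per line, so it is easy to see how a line from 'data' was split.
i=1
for arg in "$@"
do
    echo "argv[$i]: $arg"
    i=$((i+1))
done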
You can use eval for this:
#!/bin/bash
while read line
do
eval "./test $line"
done < "./data"
There's a big warning here, however: eval may do more interpretation of the file contents than you want. For example, if a line contains any I/O redirects (e.g. >somefile), they will be applied. Similarly, $variable will be substituted, ; somecommand will be executed as a separate command, etc. Basically, if the contents of the data file aren't clean enough, you can get some unexpected and potentially dangerous results.
The quotes in the file are literal, not syntactic, which means they won't be handled the same way as they would be on the command line. But you can handle them by parsing each line into an array:
$ params() {
    for param in "$@"
    do
        echo "$param"
    done
}
$ while IFS= read -r line
do
    declare -a par="($line)"
    params "${par[@]}"
    echo
done < data
-a
-camb=1 0.5 1,diff=1 0 0
-q=5

-a
-camb=0 1 0
-p
-q
-f=10
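The params function above only demonstrates the splitting; a minimal sketch of how the same approach plugs into the original loop (same assumptions about the data file) would be:
#!/bin/bash
while IFS= read -r line
do
    # Parse the line into an array, honoring the embedded quotes,
    # then pass the elements as separate arguments to ./test.
    declare -a par="($line)"
    ./test "${par[@]}"
done < "./data"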
PS: Don't use eval.