Emulating Bash 'source' in Python
I have a script that looks something like this:
export foo=/tmp/foo
export bar=/tmp/bar
Every time I build I run 'source init_env' (where init_env is the above script) to set up some variables.
To accomplish the same in Python, I had this code running:
import os, re

reg = re.compile(r'export (?P<name>\w+)(=(?P<value>.+))*')
for line in open(file):
    m = reg.match(line)
    if m:
        name = m.group('name')
        value = ''
        if m.group('value'):
            value = m.group('value')
        os.putenv(name, value)
But then someone decided it would be nice to add a line like the following to the init_env file:
export PATH="/foo/bar:/bar/foo:$PATH"
Obviously my Python script fell apart. I could modify the Python script to handle this line, but then it'll just break later on when someone comes up with a new feature to use in the init_env file.
The question is: is there an easy way to run a Bash command and let it modify my os.environ?
The problem with your approach is that you are trying to interpret bash scripts. First you just try to interpret the export statement. Then you notice people are using variable expansion. Later people will put conditionals in their files, or process substitutions. In the end you will have a full blown bash script interpreter with a gazillion bugs. Don't do that.
Let Bash interpret the file for you and then collect the results.
You can do it like this:
#! /usr/bin/env python
import os
import pprint
import shlex
import subprocess

# Run bash in a clean environment, source the file, then dump the resulting
# environment, one VAR=value pair per line.
command = shlex.split("env -i bash -c 'source init_env && env'")
proc = subprocess.Popen(command, stdout=subprocess.PIPE)
for line in proc.stdout:
    (key, _, value) = line.partition("=")
    os.environ[key] = value.rstrip("\n")
proc.communicate()

pprint.pprint(dict(os.environ))
Make sure that you handle errors in case bash fails to source init_env, or bash itself fails to execute, or subprocess fails to execute bash, or any other errors.
The env -i at the beginning of the command line creates a clean environment. That means you will only get the environment variables from init_env. If you want the inherited system environment then omit env -i.
Read the documentation on subprocess for more details.
Note: this will only capture variables set with the export statement, as env only prints exported variables.
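If you also need variables that init_env sets without export, one option is to turn on bash's allexport mode before sourcing. A minimal sketch of that variant, assuming bash and the same init_env file:

import os
import shlex
import subprocess

# set -a marks every variable assigned while sourcing as exported,
# so plain foo=/tmp/foo lines show up in the output of env as well.
command = shlex.split("env -i bash -c 'set -a && source init_env && env'")
proc = subprocess.Popen(command, stdout=subprocess.PIPE)
for line in proc.stdout:
    key, _, value = line.partition("=")
    os.environ[key] = value.rstrip("\n")
proc.communicate()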
Enjoy.
Note that the Python documentation says that if you want to manipulate the environment you should manipulate os.environ directly instead of using os.putenv(). I consider that a bug, but I digress.
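A minimal sketch of that change applied to the question's original loop (file name init_env assumed):

import os, re

# Writing to os.environ updates both Python's view of the environment and
# the underlying process environment; os.putenv() changes only the latter.
reg = re.compile(r'export (?P<name>\w+)(=(?P<value>.+))*')
with open("init_env") as fh:
    for line in fh:
        m = reg.match(line)
        if m:
            os.environ[m.group('name')] = m.group('value') or ''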
Using pickle:
import os, pickle
# For clarity, I moved this string out of the command
source = 'source init_env'
dump = '/usr/bin/python -c "import os,pickle;print pickle.dumps(os.environ)"'
penv = os.popen('%s && %s' %(source,dump))
env = pickle.loads(penv.read())
os.environ = env
Updated:
This uses json, subprocess, and explicitly uses /bin/bash (for ubuntu support):
import os, subprocess as sp, json
source = 'source init_env'
dump = '/usr/bin/python -c "import os, json;print json.dumps(dict(os.environ))"'
pipe = sp.Popen(['/bin/bash', '-c', '%s && %s' %(source,dump)], stdout=sp.PIPE)
env = json.loads(pipe.stdout.read())
os.environ = env
Rather than having your Python script source the bash script, it would be simpler and more elegant to have a wrapper script source init_env and then run your Python script with the modified environment.
#!/bin/bash
source init_env
/run/python/script.py
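The wrapped script then sees those variables like any other part of its environment. A minimal sketch of what /run/python/script.py could look like, using the variable names from the question:

#!/usr/bin/env python
import os

# foo and bar were exported by init_env before this script started,
# so they are already present in os.environ.
print(os.environ.get("foo"))   # /tmp/foo
print(os.environ.get("bar"))   # /tmp/bar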
Updated @lesmana's answer for Python 3. Notice the use of env -i
which prevents extraneous environment variables from being set/reset (potentially incorrectly given the lack of handling for multiline env variables).
import os, subprocess

if os.path.isfile("init_env"):
    command = 'env -i sh -c "source init_env && env"'
    for line in subprocess.getoutput(command).split("\n"):
        key, _, value = line.partition("=")
        os.environ[key] = value
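If multiline values are a concern (the caveat mentioned above), one workaround is to ask for NUL-separated output instead of splitting on newlines. A sketch, assuming GNU coreutils env, which supports the -0 flag:

import os
import subprocess

# env -0 terminates each VAR=value entry with a NUL byte instead of a
# newline, so values that contain newlines survive the round trip.
out = subprocess.check_output(['env', '-i', 'bash', '-c', 'source init_env && env -0'])
for entry in out.split(b'\0'):
    if entry:
        key, _, value = entry.partition(b'=')
        os.environ[key.decode()] = value.decode()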
Example wrapping @Brian's excellent answer in a function:
import json
import subprocess

# Returns a dictionary of the environment variables resulting from sourcing a file.
def env_from_sourcing(file_to_source_path, include_unexported_variables=False):
    source = '%ssource %s' % ("set -a && " if include_unexported_variables else "", file_to_source_path)
    dump = '/usr/bin/python -c "import os, json; print json.dumps(dict(os.environ))"'
    pipe = subprocess.Popen(['/bin/bash', '-c', '%s && %s' % (source, dump)], stdout=subprocess.PIPE)
    return json.loads(pipe.stdout.read())
I'm using this utility function to read AWS credentials and docker .env files with include_unexported_variables=True.
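For example (the file names and variable below are hypothetical, just to show the call shape):

# Hypothetical files: any sourceable shell snippet works.
aws_env = env_from_sourcing("aws_credentials.sh")
docker_env = env_from_sourcing(".env", include_unexported_variables=True)
print(docker_env.get("DATABASE_URL"))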
Best workaround I found is this:
- Write a wrapper bash script that calls your Python script.
- In that wrapper script, source init_env first and then call the Python script, so it inherits the resulting environment.