Confused by Python package scoping
I'm new to Python, and confused by some behavior I'm seeing.
I have a directory called d. In this directory, I have two files:

__init__.py:

from d import *

and d.py:

var = None

def printVar():
    global var
    print "from d: var=%s" % `var`
From the directory above d, I get this interaction within Python:
>>> import d
>>> d.var = 5
>>> d.printVar()
from d: var=None
Why is var not changed from the perspective of d.py?
My real goal is to accomplish the following:
- Keep __init__.py small
- Be able to change a d.py-global variable
If it makes a difference, I have multiple files in my package directory, and it would be suboptimal to combine them into a single file.
What is an acceptable way to do this?
When you say:

import d

you're importing the package, not the module. Just import the module d within the package:
>>> from d import d
>>> d.var = 5
>>> d.printVar()
from d: var=5
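
To see that these are two different bindings, you can import both the package and the submodule and compare them. A minimal interactive sketch, assuming the same package layout as in the question:

>>> import d          # the package (runs __init__.py)
>>> import d.d        # the module d/d.py inside it
>>> d.var = 5         # rebinds only the name copied into the package
>>> print d.var
5
>>> print d.d.var     # the module-level global is untouched
None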
I think the actual name of the global var should be d.d.var (because it lives in module d inside package d).

So you could:

1) Just refer to it as d.d.var when you set it
2) Make a setter in d.py (see the sketch after this list)

Unfortunately, these probably won't work:

a) Copy it into d.var and try to set it there (what you have in your question) -- it's a different variable
b) Import it back with from __init__ import var (recursive imports)
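
A minimal sketch of option 2, assuming you add a setter next to the existing code in d.py (the name setVar is illustrative, not part of the original code):

# d/d.py -- sketch; setVar is a hypothetical name
var = None

def setVar(value):
    global var              # rebinds the module-level global
    var = value

def printVar():
    global var
    print "from d: var=%s" % `var`

Because __init__.py does from d import *, the setter is also copied onto the package, and calling it still updates the module's own global:

>>> import d
>>> d.setVar(5)
>>> d.printVar()
from d: var=5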
from X import *

copies all of the names from X into the importing module. As a result, rebinding one of those names locally won't change the original module's variable.
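
For example, using a hypothetical module X that defines a single name y:

# assume a hypothetical module X.py containing just:  y = 1
from X import *    # copies the name y into this module's namespace
y = 2              # rebinds only the local copy
import X
print X.y          # prints 1 -- X's own global was never touched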
As for what you actually want to do, you could use some Python hackery and replace the module object with one of your own that overloads attribute assignment. But don't do that: it would require a lot of code and just make you look odd.
I suggest having a function that the package's client code can call, like the setter sketched above.