Why do Python functions get garbage collected?
I have a C++ library which uses Python callbacks. The callback, i.e. a PyObject*, is stored in an object of class UnaryFunction, and the constructor Py_INCREFs it. The destructor Py_XDECREFs it. That's the problem: the interpreter segfaults on that DECREF.
My solution is to just not DECREF it, but that seems wrong. What is the proper way to INC/DEC the reference count of a function, and more importantly, why does the interpreter try to GC a function body when there are other live references to it?
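For context, the wrapper presumably looks roughly like this (a sketch only; the member and method names are my assumptions, not the actual code):

#include <Python.h>

// Sketch of the wrapper described above (names are assumptions).
class UnaryFunction {
public:
    explicit UnaryFunction(PyObject* callback) : callback_(callback) {
        Py_INCREF(callback_);               // constructor takes a reference
    }
    UnaryFunction(const UnaryFunction& other) : callback_(other.callback_) {
        Py_XINCREF(callback_);              // copies must own their own reference
    }
    ~UnaryFunction() {
        Py_XDECREF(callback_);              // this is the DECREF that crashes
    }
    PyObject* operator()(PyObject* arg) {
        // The caller must hold the GIL when invoking the callback.
        return PyObject_CallFunctionObjArgs(callback_, arg, NULL);
    }
private:
    PyObject* callback_;
};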
Edit: On Linux, instead of a segfault, I get an assertion failure that says:
python: Objects/funcobject.c:442: func_dealloc: Assertion 'g->gc.gc_refs != (-2)' failed.
A crash does not necessarily mean that the interpreter is trying to GC an object that is still in use. It can also mean that you are calling Python code without holding the interpreter lock (the GIL).
Calling Py_XDECREF in a destructor leads me to think you have something like this:
void MyCallback(UnaryFunction myfunc, PyObject* myarg)
{
    ...
    PyGILState_STATE gilstate = PyGILState_Ensure();
    try {
        myfunc(myarg);
    } catch (...) {
        ...
    }
    PyGILState_Release(gilstate);
    // myfunc goes out of scope here and its destructor calls Py_XDECREF
    // --> CRASH because we no longer hold the GIL
}
with the simple solution:
...
try {
    UnaryFunction scopefunc = myfunc;  // local copy, destroyed when the try block ends
    myfunc = emptyfunc();              // the parameter keeps only a harmless empty callback
    scopefunc(myarg);
} ...
// scopefunc's destructor (and its Py_XDECREF) ran while we still held the GIL
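A related option (not from the answer above, just a sketch using the callback_ member name from the earlier sketch) is to make the wrapper's destructor safe no matter where it runs, by acquiring the GIL around the DECREF itself:

~UnaryFunction() {
    // Safe even if the object is destroyed outside an existing
    // PyGILState_Ensure/Release section; Ensure is reentrant.
    PyGILState_STATE gilstate = PyGILState_Ensure();
    Py_XDECREF(callback_);
    PyGILState_Release(gilstate);
}

With that in place the scoped-copy workaround is unnecessary, though the destructor must still run before the interpreter is finalized.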
It appears the Py_INCREF call simply doesn't actually increment the refcount.