I agree that passing function parameters through globals is not the best solution.
It works in the following way: executing custom code (in our case, a Python function invocation) is done with PyEval_EvalCode. As input to this C function you pass a dictionary of globals that will be available to the executed code. The PLyProcedure structure stores "PyObject *globals;", which is the dictionary of globals for a specific function. So SPI works fine here, since each function has a separate dictionary of globals and they don't conflict with each other.
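As a rough pure-Python analogy of this setup (using exec() in place of the actual PyEval_EvalCode C call; the names below are invented for illustration): each "procedure" gets its own globals dict, so identical variable names in two functions never collide.

```python
# Sketch only: a plain dict stands in for each PLyProcedure's
# "PyObject *globals" passed to PyEval_EvalCode.
proc_a_globals = {"x": 1}
proc_b_globals = {"x": 2}  # same name "x", but a separate dict: no conflict

exec("result = x * 10", proc_a_globals)
exec("result = x * 10", proc_b_globals)

print(proc_a_globals["result"])  # 10
print(proc_b_globals["result"])  # 20
```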
One scenario where the problem occurs is calling the same set-returning function twice in a single query. The two calls share the same "globals", which is not a bad thing in itself, but when one call finishes execution and deallocates the global holding its input parameter, the second call fails trying to do the same. I included a fix for this problem in my patch.
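A hedged pure-Python sketch of that failure mode, with hypothetical names (the real code is C inside PL/Python): two calls share one globals dict, so the second cleanup of the parameter key finds it already gone.

```python
# A plain dict stands in for the "globals" shared by the two
# simultaneous calls of the same set-returning function.
shared_globals = {}

shared_globals["a"] = 10     # first call stores its input parameter
shared_globals["a"] = 10     # second call overwrites it harmlessly

del shared_globals["a"]      # first call finishes and deallocates "a"
try:
    del shared_globals["a"]  # second call's cleanup: the key is already gone
except KeyError:
    print("second call failed cleaning up 'a'")
```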
The second scenario where the problem occurs is calling the same PL/Python function recursively. For example, this code will not work:
create or replace function test(a int) returns int as $BODY$
if a <= 0:
    return 0
r = plpy.execute("SELECT test(%d) as a" % (a-1))[0]['a']
return a + r
$BODY$ language plpythonu;
select test(10);
The function "test" has a single PLyProcedure object allocated to handle it, thus it has a single "globals" dictionary. When internal function call finishes, it removes the key "a" from the dictionary, and the outer function fails with "NameError: global name 'a' is not defined" when it tries to execute "return a + r"
But the second issue is a separate story, and I think it deserves a separate patch.