So I set out to develop a DB-API-compatible layer on top of PL/Python: plpydbapi
```sql
CREATE FUNCTION test() RETURNS void
LANGUAGE plpythonu
AS $$
import plpydbapi

dbconn = plpydbapi.connect()
cursor = dbconn.cursor()
cursor.execute("SELECT ... FROM ...")
for row in cursor.fetchall():
    plpy.notice("got row %s" % row)
dbconn.close()
$$;
```
Granted, it's more verbose than the native PL/Python syntax, so you might not want to use it after all. But it can be helpful if your database calls are nested inside other modules, or if you just don't want to learn yet another database access API.
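For comparison, the equivalent function in native PL/Python is shorter, since `plpy.execute` returns iterable result rows directly (a sketch; the query placeholders are the same as above):

```sql
CREATE FUNCTION test_native() RETURNS void
LANGUAGE plpythonu
AS $$
for row in plpy.execute("SELECT ... FROM ..."):
    plpy.notice("got row %s" % row)
$$;
```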
This started out more as an experiment, but it turns out that with the many improvements in PL/Python in PostgreSQL 9.1, it's possible to do this. (Subtransaction control and exception handling were the big issues.) The one gaping hole is that there is apparently no way to get metadata out of a query result. Something to address in PostgreSQL 9.2, perhaps.
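To illustrate one of those PostgreSQL 9.1 features: PL/Python gained explicit subtransaction control and catchable SPI errors, which is what makes DB-API-style commit/rollback semantics implementable on top of it. A minimal sketch (the `accounts` table is hypothetical):

```sql
CREATE FUNCTION transfer() RETURNS void
LANGUAGE plpythonu
AS $$
try:
    # Both updates succeed or fail together; the subtransaction
    # is rolled back automatically if an exception escapes the block.
    with plpy.subtransaction():
        plpy.execute("UPDATE accounts SET balance = balance - 100 WHERE id = 1")
        plpy.execute("UPDATE accounts SET balance = balance + 100 WHERE id = 2")
except plpy.SPIError, e:
    plpy.notice("subtransaction rolled back: %s" % e)
$$;
```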
Thanks go to the DB-API compliance test suite, which was extremely helpful in making this happen. (Nonetheless, the test suite is quite incomplete in some regards, so treat the result with care anyway.)
Another thing that I found neat about this project is that I managed to get the unit tests based on Python's unittest module to run in the PL/Python environment inside the PostgreSQL server. That's the power of unittest2.
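The key trick for running unittest-based tests inside a server process is to drive the test runner programmatically instead of calling `unittest.main()`, which would call `sys.exit()` and take the backend down with it. A minimal sketch of that pattern (the test case here is just a stand-in, not from the plpydbapi suite):

```python
import io
import unittest

class TestExample(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(1 + 1, 2)

def run_tests():
    # Load the test case into a suite and run it with an explicit
    # runner, capturing the report instead of writing to stderr.
    loader = unittest.TestLoader()
    suite = loader.loadTestsFromTestCase(TestExample)
    stream = io.StringIO()
    runner = unittest.TextTestRunner(stream=stream, verbosity=2)
    result = runner.run(suite)
    return result.wasSuccessful()

print(run_tests())  # → True
```

Inside PL/Python, the report captured in `stream` could then be emitted via `plpy.notice` rather than printed.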