October 6th, 2003, 12:53 PM
mod_python & MySQLdb question
Consider the following code:

import MySQLdb

def process(req, value):
    db = MySQLdb.connect(host="localhost", user="foo", passwd="bar", db="pythondb")
    cursor = db.cursor()
    # ... do something with the database connection ...

When you access this file through a browser with /process?value=something appended, mod_python imports it, executes the process function with value as a parameter, and displays whatever it returns. My question is: would it be a good idea to move the part where I connect to the database to module (global) level? Then it wouldn't be necessary to reconnect every time the process function is called.
Last edited by Cuboidz; October 6th, 2003 at 12:57 PM.
October 6th, 2003, 04:42 PM
It really depends on a few things: how often do you plan on calling this? If it's more than once, you may want to alter your function to handle just the query and keep the connection elsewhere, or pass the connection to the function as a parameter (I haven't tested any of these ideas, so they might not work quite like this).
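A rough sketch of that second idea, passing the connection in as a parameter (untested; the table and column names are just placeholders I made up):

```python
def get_connection():
    # The caller creates one connection up front and keeps it around.
    # MySQLdb is a third-party module, imported lazily here.
    import MySQLdb
    return MySQLdb.connect(host="localhost", user="foo",
                           passwd="bar", db="pythondb")

def process(req, value, db):
    # The function only handles the query; it never connects itself,
    # so the caller decides how long the connection lives.
    cursor = db.cursor()
    cursor.execute("SELECT data FROM sometable WHERE name = %s", (value,))
    return cursor.fetchall()
```

The nice side effect is that process() works with any object that has a cursor() method, which also makes it easy to test without a live database.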
October 6th, 2003, 08:18 PM
Sounds like you need a pconnect() equivalent for Python.
You could build a routine that checks the validity of 'db' or 'cursor' and decides whether a new connection is needed.
To answer the question: yes, I think it is always better to hold one connection open than to reconnect for every single query (or even group of queries).
netytan's idea of connecting outside the routine and passing the handle in is a good solution as well.
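Such a check might look roughly like this (untested; the zero-argument connection factory is hypothetical, e.g. lambda: MySQLdb.connect(host="localhost", user="foo", passwd="bar", db="pythondb") -- with MySQLdb the failed ping would raise OperationalError, but a broad except keeps the sketch generic):

```python
_db = None  # cached module-level connection

def get_db(connect):
    """Return a cached connection, reconnecting only when the old one
    no longer answers a ping.  `connect` is any zero-argument factory
    that returns a new connection."""
    global _db
    if _db is not None:
        try:
            _db.ping()      # cheap liveness check; raises if the
            return _db      # server has dropped the link
        except Exception:
            _db = None      # stale handle; fall through and reconnect
    _db = connect()
    return _db
```

Repeated calls then reuse the same handle, and only a dead connection costs you a reconnect.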
October 7th, 2003, 01:36 PM
Thank you both for answering. I guess I asked this question because I was afraid that keeping a connection open for every user viewing your page would leave fewer free connections available at any given time. But maybe it doesn't work this way; I don't know.
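For what it's worth, one way to put a hard ceiling on how many connections the server ever sees is a small pool: N connections are created up front, and each request borrows one and returns it when done. A minimal sketch (untested; `factory` stands for whatever creates a real connection, e.g. a MySQLdb.connect call):

```python
try:
    import queue            # Python 3 spelling
except ImportError:
    import Queue as queue   # Python 2, current when this thread was written

class ConnectionPool:
    """Hold at most `size` connections; callers block in get() until
    one is free, so the database never sees more than `size` at once."""
    def __init__(self, factory, size):
        self._pool = queue.Queue(size)
        for _ in range(size):
            self._pool.put(factory())

    def get(self):
        return self._pool.get()    # blocks while all are borrowed

    def put(self, conn):
        self._pool.put(conn)       # return a borrowed connection
```

With that, many simultaneous viewers share a fixed number of connections instead of each holding their own.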