November 9th, 2004, 08:06 AM
Multithreading with Python and ODBC
I'm designing (well, debugging, to be specific) a server program that parses files and either adds or updates database records with the file information. For speed and maintainability it is multithreaded.
We share the same ODBC connection across all threads (because we had trouble creating separate ODBC connections for the different threads).
The problem occurred when writing to the database. Each thread had its own ODBC cursor, but when two (or more) threads attempted to write to the database at the same time, Python crashed.
There are three ways I can think of to avoid this, plus an option in case there's a better way:
1.) Have a persistent 'database writer' thread and pass / pipe messages to it to perform the writes.
2.) Have a non-persistent 'database writer' thread that we spawn whenever something needs writing, with a maximum thread count of 1 to avoid collisions.
3.) Put a protected section around all database writes so that only one thread can write at a time.
4.) Do something else entirely; see my follow-up post for how it really turned out.
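Option 3 can be sketched with a shared `threading.Lock`. This is a minimal illustration, not the original code: `safe_write`, the cursor, and the SQL are placeholders, and it assumes each thread already has its own cursor on the shared connection.

```python
import threading

# One lock shared by every worker thread. Holding it around each
# write serializes access to the ODBC connection, so no two threads
# can execute a write at the same time.
db_lock = threading.Lock()

def safe_write(cursor, sql, params=()):
    # The lock is held only for the duration of the write itself,
    # so parsing work in other threads can continue in parallel.
    with db_lock:
        cursor.execute(sql, params)
```

Each worker thread would call `safe_write` instead of `cursor.execute` directly; everything outside the lock still runs concurrently.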
November 9th, 2004, 05:48 PM
I would have a single database thread, and a queue of objects to write to the database. Any thread can push objects to the queue, which will block if the queue is full. The database thread takes objects off the queue and writes them to the database, and will wait if the queue is empty.
See the Queue module in the Python library for full details.
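The pattern described above might look something like this sketch (the module is spelled `queue` in Python 3; `write_to_db` stands in for the real ODBC write):

```python
import queue
import threading

SENTINEL = None  # pushed once to tell the writer thread to stop

def writer(q, write_to_db):
    # The single database thread: takes items off the queue and
    # writes them; q.get() blocks while the queue is empty.
    while True:
        item = q.get()
        if item is SENTINEL:
            break
        write_to_db(item)

q = queue.Queue(maxsize=100)   # put() blocks when the queue is full
results = []                   # stands in for the database
t = threading.Thread(target=writer, args=(q, results.append))
t.start()

for n in range(3):
    q.put(n)                   # any thread may push work like this
q.put(SENTINEL)
t.join()
# results is now [0, 1, 2], all written by the one writer thread
```

Because only the writer thread ever touches the database, the cursor never needs a lock, and the bounded queue also throttles the parser threads if they outpace the writes.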
Dave - The Developers' Coach
November 12th, 2004, 07:57 AM
Thanks for the responses...
I ended up going with a protected section, since the code was already developed and tearing it all up would have been more of a hassle than a savings.
In the future, though, I'll probably stick with a persistent database-writer thread that the other threads pass messages to.