I have a database application I am working on, and I have been trying to figure out the best way of keeping the memory requirements reasonable (preferably lower than 150 MB total), since the ISP may take exception if I am bogging down the server.

I need to use between 2000 and 15000 records to build what I want in HTML. I know XSLT well, but to use XSLT I would have to put the whole XML file in memory, which could be 40+ MB, plus the stylesheets and DTD.

I could do this with SAX, but I don't want to create and maintain something like that long term.

Because I am pulling the original data out of a Postgres database, my thought was to pull the records into a dictionary, which would likely use memory better, write the output a page at a time as XML, and then transform each page; that way I have small XML docs instead of one huge one to begin with. Is this a better idea, or will I have similar trouble working with a 15000-record dictionary before I even get to the transformation, just as I expect with a large XML file and XSLT?
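To make the paging idea concrete, here is a rough sketch of the sort of thing I mean, using psycopg2 for the database and lxml for the XSLT (the table name, column names, and stylesheet path are made up). A server-side cursor means only one page of records is ever in client memory at a time:

import psycopg2
from lxml import etree

PAGE_SIZE = 500  # records per output page; tune to taste

def paged_transform(dsn, xsl_path, out_prefix):
    """Stream rows from Postgres and transform them one page at a
    time, so only PAGE_SIZE records are in memory at once."""
    transform = etree.XSLT(etree.parse(xsl_path))  # compile stylesheet once
    conn = psycopg2.connect(dsn)
    # A named cursor is server-side: rows are fetched in batches
    # instead of being pulled into the client all at once.
    cur = conn.cursor(name="page_feed")
    cur.execute("SELECT id, title, body FROM records ORDER BY id")
    page_num = 0
    while True:
        rows = cur.fetchmany(PAGE_SIZE)
        if not rows:
            break
        # Build a small XML doc holding just this page of records.
        root = etree.Element("page")
        for rec_id, title, body in rows:
            rec = etree.SubElement(root, "record", id=str(rec_id))
            etree.SubElement(rec, "title").text = title
            etree.SubElement(rec, "body").text = body
        # Transform the small doc and write the HTML out.
        html = transform(root)
        with open("%s-%04d.html" % (out_prefix, page_num), "wb") as f:
            f.write(etree.tostring(html, pretty_print=True))
        page_num += 1
    cur.close()
    conn.close()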

I read in a post a while back (around June 7/04) about someone suggesting a wrapper class for dealing with large dictionaries. Does anyone have an example, and some knowledge of what this would accomplish?
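I could not find that post again, so this is only a guess at what it meant, but one common pattern is a thin dict-style wrapper that keeps the data on disk (here via shelve) so only the entries you touch are loaded into memory:

import shelve

class DiskDict:
    """Dict-like wrapper backed by a shelve file, so the full record
    set lives on disk and only accessed values come into memory.
    (A guess at what that wrapper class was for -- I haven't seen it.)"""

    def __init__(self, path):
        self._db = shelve.open(path)

    def __setitem__(self, key, value):
        self._db[str(key)] = value  # shelve keys must be strings

    def __getitem__(self, key):
        return self._db[str(key)]

    def __len__(self):
        return len(self._db)

    def close(self):
        self._db.close()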

Also, does anyone know of a tool that could measure the difference in RAM requirements between an XSLT process and using a Python dictionary to break the information into smaller pieces?
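For a rough comparison on Unix, the resource module can report a process's peak memory; running each approach in its own process and printing the figure at the end would at least give a ballpark answer:

import resource

def peak_rss():
    """Peak resident set size of this process so far, as reported by
    getrusage(2).  On Linux ru_maxrss is in kilobytes; on some BSDs
    it is bytes, so check your platform's man page before comparing."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

# Run the XSLT version and the dictionary version in separate
# processes, print peak_rss() at the end of each, and compare.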

Advice appreciated.