November 14th, 2000, 10:38 AM
We have a site that long ran on Perl and had a database, but we didn't use it to its full potential. This summer we converted to a PHP/MySQL-based site.
We have approximately 20,000 users/day, and on various pages we pull articles and/or raw statistical data out of the db. The articles table holds about 800 records and growing; the other tables hold fairly fixed amounts of data (roughly 3,000 records in one, 800 in another, 2,000 in a third), from which we sometimes do a number of joins, none too complicated.
Oh, and we have, for argument's sake, about 512MB of RAM.
Some of these dynamically generated pages only change, say, twice a week.
The question is ... are we better off creating static pages and serving them that way? Or are we fine, given our user load, to serve dynamic pages?
I'm pretty new to this - and maybe this is an age-old question - I don't know. But I'm wondering. I find dynamic content much easier to deal with: you don't have to worry about storing a bunch of files on the local disk, and the code is simpler.
Others, however, are worried about server load.
When we create static pages, we check the database for its latest update and compare that against the timestamp of the most recently created static file. If the database has newer data, the static file gets rebuilt from the database; if not, the existing static page gets served.
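The check-then-rebuild logic described above can be sketched in a few lines. This is a minimal, language-agnostic illustration (shown in Python rather than the site's PHP), and the function and parameter names are my own, not from the original setup:

```python
import os

def serve_page(static_path, db_last_update, rebuild):
    """Serve a cached static page, rebuilding only when the database
    holds newer data than the cached copy.

    static_path    -- path of the cached static file (illustrative)
    db_last_update -- Unix timestamp of the latest database update
    rebuild        -- callable that regenerates the page's HTML
    """
    stale = (not os.path.exists(static_path)
             or os.path.getmtime(static_path) < db_last_update)
    if stale:
        html = rebuild()  # hit the database and regenerate the page
        with open(static_path, "w") as f:
            f.write(html)
        # stamp the file with the DB's update time so later
        # comparisons are against the data's age, not the rebuild's
        os.utime(static_path, (db_last_update, db_last_update))
    with open(static_path) as f:
        return f.read()
```

Note that even on a cache hit this approach still costs one database query per request (to find the latest update time), which is part of why the savings may be smaller than they look.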
I'd contend that this process is actually slower and more server-intensive than just serving the dynamic page every time.
On the other hand, what if you use a Perl cron job to create the static pages at a set time every day? Is that better, worse, or does it not matter?
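For concreteness, a scheduled rebuild would just be a crontab entry pointing at a regeneration script; the script path and schedule here are hypothetical:

```shell
# crontab entry: regenerate the static pages at 3:00 AM every day
# (script name is made up for illustration)
0 3 * * * /usr/local/bin/rebuild_pages.pl
```

With this approach the per-request staleness check disappears entirely, at the cost of pages being up to a day out of date, which may be acceptable for content that only changes a couple of times a week.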
Looking for opinions.