January 9th, 2002, 12:34 PM
Wasn't sure which forum this topic belonged in, so here I am.
Looking for ideas on this one.
The small company I work for runs an e-commerce site driven by Perl and MySQL. Everything runs off a dedicated Linux server hosted at a server farm. The problem is that we don't have a test server. Changes to the Perl scripts and MySQL tables are made live! As you might imagine, this can cause some major problems.
We would like to set up some sort of testing platform within our limited budget and manpower. The two ideas I have come up with are either setting up a test Linux server in-house (which would be essentially free) or setting aside a certain part of our primary server as a testing area, whether that be a different virtual domain or a separate hard drive... Not being an absolute Linux guru, I'm afraid that trying to set up an in-house Linux test server identical (or close) to our primary server would be a huge project, not to mention one more server to maintain. I'm leaning towards just setting up a test area on our current server, but I'm not sure of the best way to do this.
The main problem I see with either of these ideas is: how do you synchronize files between the two? In order to have a 'test area' set up on our current server, I'll need to completely replicate the database and all web-site files. Once I've made a number of changes in the test area, how do I update the live area accordingly? For example, I add a few products to the test database, then want to put just one of those products up on the live site. Is there an easy way to manage this synchronization (software?)?
All of this makes me wonder how the big guys do it, like Amazon or even this site!
Any thoughts or ideas will be greatly appreciated.
January 9th, 2002, 01:36 PM
It's incredible; every serious shop uses three-stage development!!
OK, for budget reasons a stage often gets dropped, but you are running everything on the production server!
Your customers must never have heard of an SLA!!!
Put up an in-house server and use FTP or WebDAV, plus CVS!
January 9th, 2002, 10:26 PM
A limited budget is hardly a barrier to having a development/testing machine.
Usually a very cheap used computer is all you need. Find out what version of Linux is running on your dedicated server, and then get that same version. If you are using standard configurations of Apache, Perl and MySQL, then it's very easy, because every Linux distribution I know of includes these as choices, even with a basic (beginner) install.
If you really want to do some sophisticated performance benchmarks, then you would want a machine that is an exact duplicate of your dedicated server. Otherwise, you should be fine even with a Pentium 166 with 64 MB RAM. If your application runs fine on a slower machine, then it should have no problem on your full server. I developed my first large PHP/MySQL application on a Pentium 90 with 16 MB RAM, and had no problem at all.
There are many solutions to synchronization, but you should mainly divide it into two issues: 1. code synchronization (the Perl code that runs the web application), and 2. data synchronization (MySQL tables, flat-file databases, lists, etc.).
1. At the beginning, code synchronization can even be done manually, if you don't have too many files. Just be sure to enforce some method of transferring files back and forth. Eventually it is a good idea to move to CVS, or some other source control application.
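Even a purely manual process benefits from checking exactly what differs before anything gets copied to the live area. A minimal sketch using plain `diff` (the paths and file names below are made up for illustration):

```shell
# Stand-ins for a "live" copy and a "test" copy of the site
# (hypothetical paths and file names, just for demonstration)
mkdir -p /tmp/sync_demo/live /tmp/sync_demo/test
echo 'print "version 1\n";' > /tmp/sync_demo/live/cart.pl
echo 'print "version 2\n";' > /tmp/sync_demo/test/cart.pl

# Recursively list every difference between the two trees.
# Note: diff exits non-zero when files differ, hence the || true
# if you run this under a script with `set -e`.
diff -ru /tmp/sync_demo/live /tmp/sync_demo/test || true
```

Reviewing that output before each transfer is a poor man's version of what CVS gives you for free with `cvs diff`.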
The following command is very useful in getting a compressed "snapshot" of a directory:
tar -zpcf my_snapshot.tar.gz directory_name/
Unpack this complete directory tree anywhere with:
tar -zxvf my_snapshot.tar.gz (careful: this will overwrite any files with the same names in the directory where you unpack)
(I'm sure you know these, but I'm providing them just in case)
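For the cautious, you can also list an archive's contents with -t before unpacking, and rehearse the whole snapshot/restore round trip in a scratch directory first. A quick sketch with made-up paths:

```shell
# Build a scratch directory tree to snapshot (hypothetical content)
mkdir -p /tmp/tar_demo/site/cgi-bin
echo 'print "hello\n";' > /tmp/tar_demo/site/cgi-bin/index.pl

# Snapshot it (-z gzip, -p preserve permissions, -c create, -f file)
cd /tmp/tar_demo
tar -zpcf my_snapshot.tar.gz site/

# See what's inside BEFORE unpacking, so you know what will be overwritten
tar -ztf my_snapshot.tar.gz

# Restore into a separate directory to be safe
mkdir -p /tmp/tar_demo/restore
cd /tmp/tar_demo/restore
tar -zxvf ../my_snapshot.tar.gz
```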
2. When it comes to data synchronization, you have to ask yourself: "Do I use any data during development that will be important to transfer to the server?" If so, then your application design should probably be rethought. But let's say that you want to continually work with a mirror of your main server's MySQL tables. MySQL now supports replication, which means your development server can have a continuously updated mirror of the database (remember, unless MySQL is compiled with SSL support, that data may be vulnerable to sniffers as it crosses the network).
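For reference, a replication setup in MySQL of this vintage boils down to a few lines in each server's my.cnf; the host name and credentials below are placeholders, not anything from the original setup:

```ini
# --- my.cnf on the master (the live server) ---
[mysqld]
server-id = 1
log-bin             ; enable the binary log the slave reads from

# --- my.cnf on the slave (the dev/test box) ---
[mysqld]
server-id = 2
master-host     = live.example.com
master-user     = repl
master-password = secret
```

You also have to create that replication user on the master and grant it the privileges your MySQL version requires, and seed the slave with an initial snapshot of the data before starting it.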
Usually you won't need this, though. More likely, the problem will be that you have changed the structure of your database on the development server and now want to update the main server. In that case, you need to not only put the new database structure in place, but also import the data from the old structure, massaging it as needed to fit the new structure. I usually solve this by doing several test imports on the dev server, then writing a script that can do the import quickly. When ready, you shut the site down for a couple of minutes, run a backup (mysqldump) of the existing database, rename the old database, create the database with its new structure, and run whatever scripts you need to import the data.
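As a toy illustration of the "massage" step (the schema is entirely made up: say the old products table stored prices in dollars and the new one wants integer cents), the transform script can be as small as an awk one-liner run over the mysqldump output:

```shell
# Fake mysqldump excerpt from the OLD schema (hypothetical table/values)
cat > /tmp/old_dump.sql <<'EOF'
INSERT INTO products VALUES (1,'Widget',9.99);
INSERT INTO products VALUES (2,'Gadget',24.50);
EOF

# Massage: convert the dollar price (3rd field) to integer cents
# for the NEW schema, writing a dump we can load into the new table.
awk -F',' '{
  price = $3
  sub(/\);$/, "", price)           # strip the trailing ");"
  cents = int(price * 100 + 0.5)   # round dollars to whole cents
  printf "%s,%s,%d);\n", $1, $2, cents
}' /tmp/old_dump.sql > /tmp/new_dump.sql

cat /tmp/new_dump.sql
```

The real massaging is rarely this simple, but the shape is the same: dump, transform with a script you have already rehearsed on the dev box, reload.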
As always, make multiple backup copies every time you do something like this. Also, it's always best to only do a large update right after you have done a full tape backup of your server, so you can roll back if needed.
January 9th, 2002, 10:48 PM
I use an extremely simple setup. We have a test server that is organized exactly the same way as the production server (although that's not a must). I do all my development there, and when I think the code is ready for the live server I use a small utility called rsync to sync the two versions. Typically I do:
rsync -uvre ssh /path/to/test/site/ firstname.lastname@example.org:/path/to/live/site
Then rsync figures out which files have changed, and uses ssh to upload them. It's amazingly simple and works like a charm. It's especially nice when you need to do some major maintenance and change lots of files, but have a slightly different directory structure on the remote server (for example, we have some folders where clients can upload data). If I were to simply copy the test version over using, say, tar, I would have to take care not to remove those extra files, and move them around while I did the untarring, etc. Very annoying. With rsync I have no worries.
Puritanism: The haunting fear that someone, somewhere may be having fun
January 10th, 2002, 03:00 AM
Not to mention that you can use VMware or Virtual PC and run a second OS on your PC; just add some memory.