February 27th, 2013, 11:03 PM
Best method for remote file read/write
- Two dedicated servers
- Server A: generates the XML files, running at high CPU load for approximately 5 minutes out of every 15 while it does so. It also makes database updates to Server B during this process, but that is not relevant to the question.
- Server B: the "web server". It does little or no heavy processing, mostly reading from databases and building PHP pages at user request.
- The XML files are approximately 500 KB each
- While processing is underway, an XML file is stored approximately every 5 seconds
- Up to 5,000 of these files will be created daily
- Files will remain on the server for approximately a week and then be deleted by a cleanup cron job.
- What I'm after: the best method for the XML files created on Server A to be delivered to users, on request, in as timely a manner as possible. The two options I can see are:
1. Server A stores the XML locally, and Server B loads the files remotely, on demand, as users request them.
2. Server A writes the files out remotely to Server B (FTP? fwrite?), which can then serve them on request.
From what I understand, option 2 would be the best solution from the point of view of a user requesting a file (I've put a rough, untested sketch of what I mean by option 2 at the end of this post). However, my fear is that the large number of remote writes in a short space of time will make Server A grind to a halt on a regular basis.
Is that fear justified??
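For what it's worth, this is roughly what I have in mind for option 2. It's completely untested, and the hostname, credentials and paths are made up:

<?php
// Rough, untested sketch of option 2: push a freshly generated XML file
// from Server A to Server B over FTP, so B can serve it locally.
// Hostname, credentials and paths below are placeholders.
$localFile  = '/var/data/xml/feed_0001.xml';   // file just written on Server A
$remoteFile = '/var/www/xml/feed_0001.xml';    // where Server B will serve it from

$conn = ftp_connect('serverb.example.com');
if ($conn === false) {
    die("Could not connect to Server B\n");
}
if (!ftp_login($conn, 'xmluser', 'secret')) {
    ftp_close($conn);
    die("FTP login failed\n");
}
ftp_pasv($conn, true); // passive mode tends to play nicer with firewalls

if (!ftp_put($conn, $remoteFile, $localFile, FTP_BINARY)) {
    error_log("Upload of $localFile to Server B failed");
}
ftp_close($conn);
?>

The fwrite route would, I believe, just swap the ftp_* calls for writing to an ftp:// stream, but either way the end result is the same: one roughly 500 KB upload every 5 seconds or so while Server A is processing.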
February 28th, 2013, 01:56 AM
How about making a network share on one of the servers and then mounting it on the other? Keeping the share on A and mounting it on B is probably better, since B does less work and does it less often.
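As a minimal sketch, assuming A's XML directory ends up mounted on B at something like /mnt/servera/xml (a made-up path), B's PHP can then serve the files exactly as if they were local:

<?php
// Minimal sketch, assuming Server A's XML directory is exported (NFS, SMB,
// whatever you prefer) and mounted on Server B at /mnt/servera/xml.
// The mount point and the "name" parameter are made up for the example.
$name = isset($_GET['name']) ? basename($_GET['name']) : ''; // basename() blocks ../ traversal
$file = '/mnt/servera/xml/' . $name;

if ($name === '' || !is_file($file)) {
    header('HTTP/1.1 404 Not Found');
    exit('Not found');
}

header('Content-Type: application/xml');
header('Content-Length: ' . filesize($file));
readfile($file); // reads across the mount as if the file were on local disk
?>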
February 28th, 2013, 02:36 AM
Thanks for the suggestion.
That provides a nice way of addressing the files, abstracts the remote access away from PHP, and makes sure that "B" is never interrupted... but does it do anything to help "B" read the files more quickly from "A", even when "A" is at high CPU load?
February 28th, 2013, 02:48 AM
I can't imagine sharing taking anything more than a negligible amount of CPU time.