January 17th, 2014, 09:15 AM
GROWING DB, >10,000 TABLES, BACKUPS
This is my first visit to this forum. Upon registering I hope to find the right solution for a nasty problem I'm dealing with.
The issue is that I am getting stuck making backups of a still-growing database with a large number of tables. I use mysqldump for this job. The problem is that together, the more than 10,000 tables consume over 300 GB and counting. The data grows during the day through the addition of more tables, each named with a date and time stamp. Backing all this up takes many, many hours, with the result that before one backup is finished, the next one is already scheduled in the crontab.
My own idea is to back up the separate tables instead of the whole database, using some 'for' or 'while' loop over the tables that have reached some kind of ready status or a certain age, or maybe I have to think about reproducing them somewhere else.
Are there any options? The engine used is MyISAM, version 10, dynamic row format.
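As a sketch of the per-table loop idea above: dump each table to its own file and skip tables already dumped, so an interrupted run can resume and each cron run only picks up new tables. The database name `mydb`, the backup directory `/backup`, and the `name_YYYYMMDD_HHMM` table-name pattern are assumptions here; adjust them to the real schema.

```shell
#!/bin/sh
# Per-table backup sketch. Assumed names: database "mydb", output
# directory "/backup", tables stamped like log_20140116_1200.
DB=mydb
OUTDIR=/backup

# Keep only table names whose embedded YYYYMMDD stamp is before today,
# i.e. tables that should no longer be receiving writes.
select_ready_tables() {
    today=$(date +%Y%m%d)
    while read -r tbl; do
        stamp=$(printf '%s\n' "$tbl" | sed -n 's/.*_\([0-9]\{8\}\)_.*/\1/p')
        [ -n "$stamp" ] && [ "$stamp" -lt "$today" ] && printf '%s\n' "$tbl"
    done
}

# Only attempt real dumps when a reachable MySQL server is available.
if command -v mysqldump >/dev/null 2>&1 && mysql "$DB" -e 'SELECT 1' >/dev/null 2>&1
then
    mysql -N "$DB" -e 'SHOW TABLES' | select_ready_tables | while read -r tbl; do
        out="$OUTDIR/$DB.$tbl.sql.gz"
        [ -f "$out" ] && continue            # already dumped on a previous run
        mysqldump "$DB" "$tbl" | gzip > "$out"
    done
fi
```

Because finished tables never change under this scheme, each run only dumps the tables created since the last one, instead of re-dumping the whole 300 GB every time.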
January 17th, 2014, 12:24 PM
Make a full backup now (or use the one you have), delete all the unused tables (I hope they're unused...), then have your cronjob (1) back up the database and then (2) delete the new tables.
January 17th, 2014, 12:55 PM
Hi Requinix, thanks for replying.
There are no unused tables unfortunately. That is just the problem.
The database is on a slow remote network. With CPU usage of 1.6% on the source and at most 4% on the destination, it takes days.
That way the backup becomes useless, as the data is changing all the time.
January 17th, 2014, 01:23 PM
You could change your backup scheme to use replication: set up a dedicated backup server that replicates from the original, and to take a backup you stop the slave (or pause replication), run the backup there, then bring it online again.
Or do you know that only a certain subset of tables are being updated?
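The replicate-then-backup cycle described above might look like this as a cron script run on the replica. The data directory and snapshot paths are placeholders, and it assumes the replica is already configured; since the tables are MyISAM, copying the raw table files while replication is paused is much faster than mysqldump at this volume.

```shell
#!/bin/sh
# Backup-from-replica sketch, run on the slave. Paths are placeholders.
DATADIR=/var/lib/mysql

# Name each snapshot after the date it was taken.
snapshot_dir() { printf '%s/%s\n' "$1" "$(date +%Y%m%d)"; }
OUTDIR=$(snapshot_dir /backup)

# Only proceed if a local MySQL server is actually reachable.
if command -v mysql >/dev/null 2>&1 && mysql -e 'SELECT 1' >/dev/null 2>&1
then
    # 1. Pause replication so the MyISAM files stop changing.
    mysql -e 'STOP SLAVE'
    # 2. Copy the raw table files (.frm/.MYD/.MYI); with the slave paused
    #    this is a consistent copy, taken without loading the master.
    mkdir -p "$OUTDIR" && rsync -a "$DATADIR/" "$OUTDIR/"
    # 3. Resume replication; the slave catches up from the master's binlog.
    mysql -e 'START SLAVE'
fi
```

The master keeps serving queries throughout; only the replica is ever paused, so the "days-long backup" no longer races the live data.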
January 17th, 2014, 02:39 PM
I'll study up on that, then get back to you when I need some more info, OK? Thanks very much in advance.