March 13th, 2013, 03:49 AM
How to free up MySQL server memory after a large batch job
I am in a Linux environment running MySQL Server, with a maximum of around 4.5 GB of memory allocated to MySQL, on a virtual server that has up to 7 GB.
I run a script that does the following every night:
# Import data from a gzipped dump into the database. It's around 6 GB of data and takes around 25 minutes.
Job1: zcat mysql_backup_file_sql.gz 2> error.log | mysql -u myuser -pmypassword mydatabase -hmy_database_server 2>> error.log
Job2: I run an SQL script with a lot of statements that aggregate, update, and insert data into new tables.
My problem is that after Job1 has finished, the MySQL server does not free up its memory, so I get an out-of-memory error when I try to run Job2. How can I free up the MySQL server's memory after Job1 has run?
For the time being I restart the MySQL server (sudo /etc/init.d/mysql restart) so I can run Job2... but that might not be the preferred way, or is it?
Help appreciated, thanks.
March 13th, 2013, 04:43 PM
That sounds very weird.
You can't, and shouldn't need to, free up a database server's memory after a batch job.
My first bet would be that you are actually allowing MySQL to use more memory than you think (calculating total memory usage is very hard to get right).
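To illustrate why the total is hard to get right: a rough upper bound is the global buffers plus the per-connection buffers multiplied by the connection limit, and that sum can be far above the number you think you configured. A minimal sketch with hypothetical values (none of these numbers come from the post; they are made up for illustration):

```shell
# Rough upper bound on MySQL memory usage: global buffers plus
# per-connection buffers multiplied by max_connections.
# All numbers below are hypothetical and in MB.
innodb_buffer_pool_size=4096   # global buffer
key_buffer_size=256            # global buffer
# sort + read + read_rnd + join buffers + tmp_table_size, per connection
per_connection=$((2 + 2 + 1 + 1 + 16))
max_connections=100

upper_bound=$((innodb_buffer_pool_size + key_buffer_size + per_connection * max_connections))
echo "${upper_bound} MB"       # with these numbers: 6552 MB from a "4 GB" buffer pool
```

So a server whose buffer pool looks like 4 GB can, under load, legitimately climb well past that before anything is "leaking".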
And what do you mean by "virtual server that has up to 7 GB"?
Are you using ballooning, so that this virtual machine could potentially have less than 4 GB available for MySQL if another virtual machine needs more RAM?
I'd place the possibility of a memory leak in MySQL far down the list, since I've never had a problem with it. But that doesn't mean it isn't a possibility.
Anyway, you need to verify how much RAM you have on that machine and how much MySQL has allocated when that error occurs.
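Both numbers can be checked with standard Linux tools (nothing MySQL-specific assumed here; `ps -C mysqld` simply prints nothing if the daemon isn't running):

```shell
# Total, used, and free system memory, in MB.
free -m

# Resident (RSS) and virtual (VSZ) size of the mysqld process, in KB.
ps -C mysqld -o rss=,vsz=,cmd= || true
```

Run this right when the out-of-memory error occurs: if mysqld's RSS is far above the configured buffer sizes, the per-connection buffers are the first place to look.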