#1
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Apr 2010
    Posts
    209
    Rep Power
    83

    Need to delete files after x days via backup bash.


    I'm very new to bash, and I have a basic script running on my server via a cron job. It backs up my MySQL databases every 15 minutes.

    However, as you can tell, this gets rather heavy. So I just need to add some code for it to DELETE anything in that folder older than the current date minus 60 days. So anything that is older than 60 days will be deleted.

    Is that too heavy to run in the same script as the backup?
    I've tried googling for some code snippets, but to no avail. Is this something simple?

    Here is what I currently have; I just need to add that "delete older than 60 days" snippet.

    Code:
    #!/bin/bash
    
    MYSQL_USERNAME='1337hov'
    MYSQL_PASSWORD='password'
    MYSQL_DATABASE='database'
    BACKUP_DIR='/home/trsmedia/mysql-backup'
    
    if [ -z "$MYSQL_USERNAME" ]
    then
    	echo "[-] MySQL username and password required for automatic backups."
    else
    	/usr/bin/mysqldump -u"$MYSQL_USERNAME" -p"$MYSQL_PASSWORD" "$MYSQL_DATABASE" | gzip > "$BACKUP_DIR/$(date +%d%m%Y_%H%M).sql.gz"
    fi
  #2
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Apr 2010
    Posts
    209
    Rep Power
    83
    The closest thing that I've found is this, but I don't know how it will find the proper files. Based on creation date? Or on the filename containing the date?

    Code:
    find /path/to/files* -mtime +30 -exec rm {} \;
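    To answer the question of what find matches on: -mtime tests the file's modification time (mtime), not anything in the filename. A quick scratch-directory check (GNU touch assumed for the -d option; /tmp/demo is just a throwaway directory):

    ```shell
    # -mtime matches on the file's modification time, not the filename.
    mkdir -p /tmp/demo
    touch /tmp/demo/new.sql.gz                   # mtime = now
    touch -d '40 days ago' /tmp/demo/old.sql.gz  # backdated mtime
    find /tmp/demo -type f -mtime +30            # lists only old.sql.gz
    ```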
  #3
    Contributing User
    Devshed Regular (2000 - 2499 posts)

    Join Date
    Mar 2006
    Posts
    2,449
    Rep Power
    1751
    That's the sort of thing you'll need - but I do suggest changing the rm {} \; to ls -ld {} \; until you are happy that it will only pick out the files you want.

    Your backups follow a naming standard (albeit with the 'wildcard' of the date) in a fixed directory, so:

    Code:
    find /home/trsmedia/mysql-backup -type f -name \*.sql.gz -mtime +59 -exec ls -ld {} \;
    should list all backups that are over 59 days old.
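    Putting that together with the script from the first post, the whole thing might look like the sketch below once the ls -ld dry run checks out. This is untested against a live MySQL server; the credentials and paths are the ones from the first post, with the variables quoted for safety.

    ```shell
    #!/bin/bash
    
    MYSQL_USERNAME='1337hov'
    MYSQL_PASSWORD='password'
    MYSQL_DATABASE='database'
    BACKUP_DIR='/home/trsmedia/mysql-backup'
    
    if [ -z "$MYSQL_USERNAME" ]
    then
    	echo "[-] MySQL username and password required for automatic backups."
    else
    	# Dump and compress, timestamped to the minute as before
    	/usr/bin/mysqldump -u"$MYSQL_USERNAME" -p"$MYSQL_PASSWORD" "$MYSQL_DATABASE" \
    		| gzip > "$BACKUP_DIR/$(date +%d%m%Y_%H%M).sql.gz"
    	# Prune backups whose mtime is more than 59 full days old
    	find "$BACKUP_DIR" -type f -name '*.sql.gz' -mtime +59 -exec rm {} \;
    fi
    ```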
    The moon on the one hand, the dawn on the other:
    The moon is my sister, the dawn is my brother.
    The moon on my left and the dawn on my right.
    My brother, good morning: my sister, good night.
    -- Hilaire Belloc
  #4
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Apr 2010
    Posts
    209
    Rep Power
    83
    Gotcha, that looks great. How is it going to list the files if I'm just running it as a cron job? In some sort of command/terminal?

    Sorry again, very new to this.
  #5
    Contributing User
    Devshed Regular (2000 - 2499 posts)

    Join Date
    Mar 2006
    Posts
    2,449
    Rep Power
    1751
    Run it by hand first: ensure that there are no bugs, and that what you get is what you want. Once you have done that, swap the ls -ld for your rm and, once again, run it by hand. Then make sure you have a valid backup in case it all drops in the pot ...
    Once you are happy, slap it in cron.

    If you wanted the list via cron, then either redirect the output to a file or just allow cron to use its default of mailing any output from the command to the user it runs as.
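    For completeness, crontab entries for the two jobs might look like the lines below. The script path and the log path are assumptions, not from this thread; note that the find line can go straight into the crontab because it contains no % characters (cron treats a bare % specially).

    ```shell
    # Hypothetical crontab entries: backup every 15 minutes, daily
    # cleanup at 03:00 with its output appended to a log file.
    */15 * * * * /home/trsmedia/backup.sh
    0 3 * * * find /home/trsmedia/mysql-backup -type f -name '*.sql.gz' -mtime +59 -exec rm {} \; >> /home/trsmedia/cleanup.log 2>&1
    ```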
  #6
    Contributing User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Apr 2010
    Posts
    209
    Rep Power
    83
    Ah ok, understood. Simon, you've been a fantastic help. I'll do this the second I get home. I appreciate your time, my man.

    gotta DL more ebooks.
