Mysqldump all databases and gzip

So, a quick and simple MySQL backup run from cron.

#!/bin/bash
_hour=$(date '+%H')
_user="root"
_pass="thepassword"
_dest="/srv/backups/daily/"
mysqldump -u "$_user" -p"$_pass" --all-databases --ignore-table=mysql.event | gzip > "${_dest}${_hour}.sql.gz"

I’ve used this in an hourly cron job so that I always have a rolling 24 hours of backups. (Note the braces in ${_dest}${_hour}: without them, bash would try to expand an unset variable called _dest_ and the output filename would be wrong.)

Note: mysql.event is ignored to silence a notice that newer versions of MySQL print to say it has skipped that table. I don’t really need it, so I’m ignoring it.
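For reference, the script above could be scheduled hourly with a crontab entry like this (the path /srv/backups/mysql_backup.sh is a hypothetical name for wherever you save the script):

$> crontab -e

# Run the backup script at minute 0 of every hour
0 * * * * /srv/backups/mysql_backup.sh

Because the script names the output file after the hour, each run simply overwrites the file from the same hour yesterday, which is what gives you the rolling 24-hour window.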

Compress files and folders in tar (Linux)

GZIP and TAR are two different utilities joined to the same cause: gzip compresses a single file, while tar bundles many files into one archive (a file catalog).

To compress a file, simply gzip it:

$> gzip database_dump.sql

That will compress the database_dump.sql file and rename it to database_dump.sql.gz. Easy. To uncompress it:

$> gunzip database_dump.sql.gz

That will restore the uncompressed version and rename the file back to .sql.
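The gzip/gunzip round trip above can be sketched end to end in a scratch directory (the file name and its contents are just placeholders):

```shell
# Work in a throwaway directory so nothing real is touched
tmp=$(mktemp -d)
cd "$tmp"

echo "SELECT 1;" > database_dump.sql
gzip database_dump.sql            # replaces the file with database_dump.sql.gz
ls database_dump.sql.gz           # only the compressed file remains
gunzip database_dump.sql.gz       # restores database_dump.sql
cat database_dump.sql
```

Note that gzip removes the original by default; pass -k if you want to keep both the original and the .gz copy.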

But what if you want to compress more than one file, or a whole folder? gzip alone is not the tool for that, although it is still used.

You need to tar the files first (which creates one file containing everything you specified) and then gzip it:

$> tar -cvzpf compressed_file.tgz foldername

That will bundle all the files in the foldername folder into the file specified, and the -z flag GZIPs it for you in the same step.

If you want to compress just a few files, list them separated by spaces instead of the folder name:

$> tar -cvzpf compressed_files.tgz file1 file2 file3

Easy. To untar them:

$> tar -zxvf compressed_file.tgz

The above will extract the files to the current folder. If you want them somewhere else:

$> tar -C /foldername -zxvf compressed_file.tgz

The above will extract the files to the /foldername folder.
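Putting the tar steps together, here is a sketch that creates an archive from a folder and then extracts it into a different directory with -C (folder and file names are placeholders):

```shell
# Work in a throwaway directory
tmp=$(mktemp -d)
cd "$tmp"

# Build a sample folder to archive
mkdir foldername
echo "one" > foldername/file1
echo "two" > foldername/file2

# Create the gzipped archive (-c create, -z gzip, -p keep permissions, -f filename)
tar -cvzpf compressed_file.tgz foldername

# Extract it into a separate directory using -C
mkdir extracted
tar -C extracted -zxvf compressed_file.tgz
cat extracted/foldername/file1
```

The -C flag only changes the directory tar extracts into; the paths stored inside the archive (here, foldername/…) are recreated beneath it.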

Hope that helps a little.