How to Backup MySQL Database

MySQL ships with the mysqldump utility, which dumps a database as SQL statements that can recreate it from scratch. The output can go to a plain text file, to a compressed file, or straight into another live database through a pipe. In this post we walk through some use cases for backing up a MySQL database in different scenarios.

  1. Backup Data Only
  2. Backup Data and Routines
  3. Backup Data and Routines with Compression
  4. Backup Data and Routines with Maximum Compression
  5. Backup or Migrate Database to a Local One
  6. Backup or Migrate Database to a Remote One
  7. More Options
    • Exclude Some Tables
    • Add a Timer
    • Put Jobs to Background

    Backup Data Only

    We'd like to back up only the data of the database to an SQL file named example_db_data.sql.

    [root@test ~]# mysqldump -u root -p'12345678' example_db > example_db_data.sql
    mysqldump: [Warning] Using a password on the command line interface can be insecure.
    [root@test ~]# ll
    ...
    -rw-r--r--. 1 root root 1713574900 Aug 21 15:40 example_db_data.sql

    As you can see, the resulting SQL file is about 1.59 GB. Later on, you will need to import this SQL file back into a database.
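
    A minimal restore sketch, assuming the target database example_db already exists on the server you restore to (create it first if it does not):

    [root@test ~]# mysql -u root -p'12345678' example_db < example_db_data.sql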

    Backup Data and Routines

    MySQL groups stored procedures and stored functions together as routines. With -R (--routines), we can add them to the backup file.

    [root@test ~]# mysqldump -u root -p'12345678' example_db -R > example_db_data_routines.sql
    mysqldump: [Warning] Using a password on the command line interface can be insecure.
    [root@test ~]# ll
    ...
    -rw-r--r--. 1 root root 1713643597 Aug 21 15:41 example_db_data_routines.sql

    The file size is a little larger than the first SQL file, but not much.
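
    Note that views and triggers are already part of a default dump, but scheduled events are not covered by -R; mysqldump has a separate -E (--events) option for those. A combined run might look like this (the output file name is just an example):

    [root@test ~]# mysqldump -u root -p'12345678' example_db -R -E > example_db_data_routines_events.sql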

    Backup Data and Routines with Compression

    If the resulting file is too large to store comfortably, we can pipe the output stream through gzip to compress it on the fly.

    [root@test ~]# mysqldump -u root -p'12345678' example_db -R | gzip > example_db_data_routines.sql.gz
    mysqldump: [Warning] Using a password on the command line interface can be insecure.
    [root@test ~]# ll
    ...
    -rw-r--r--. 1 root root  317699305 Aug 21 15:43 example_db_data_routines.sql.gz

    The file size is reduced dramatically: the compressed file is about 18.5% of the original size. Please note that you have to decompress the file before importing the SQL into a database.
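
    Decompression can be done on the fly, so there is no need to keep an uncompressed copy on disk. A sketch, again assuming the target database already exists:

    [root@test ~]# gunzip < example_db_data_routines.sql.gz | mysql -u root -p'12345678' example_db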

    Backup Data and Routines with Maximum Compression

    Compression is good, but it has a price: you trade CPU time for disk space. The highest gzip compression level is 9, which makes you pay noticeably more CPU time for a slightly smaller file.

    [root@test ~]# mysqldump -u root -p'12345678' example_db -R | gzip -9 > example_db_data_routines.sql.gz9
    mysqldump: [Warning] Using a password on the command line interface can be insecure.
    [root@test ~]# ll

    -rw-r--r--. 1 root root  310848291 Aug 21 15:47 example_db_data_routines.sql.gz9

    The compressed file is now about 18.1% of the original size, a small improvement but nothing impressive.

    Backup or Migrate Database to a Local One

    Not only files: you can also back up the whole database into another one, either local or remote. In practice, we regularly use this technique to take a snapshot of the database.

    [root@test ~]# mysqldump -u root -p'12345678' example_db -R | mysql -u root -p'12345678' -D example_db_backup
    mysqldump: [Warning] Using a password on the command line interface can be insecure.
    mysql: [Warning] Using a password on the command line interface can be insecure.

    Since both databases are on the same server, there's no need to compress the output.
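
    Note that mysqldump does not create the target database in this form, so example_db_backup has to exist before the pipe runs. A quick sketch for creating it with the same credentials:

    [root@test ~]# mysql -u root -p'12345678' -e "CREATE DATABASE IF NOT EXISTS example_db_backup"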

    Backup or Migrate Database to a Remote One

    To back up or migrate the whole database to a remote one, we have to consider the network conditions. Chances are we will transfer the data over the internet, which is much slower than a LAN. Therefore, in this example we compress the output and pipe the stream to a remote server through an SSH tunnel.

    [root@test ~]# mysqldump -u root -p'12345678' example_db -R | gzip | ssh 10.10.10.10 "gunzip | mysql -u root -p'12345678' -D example_db_backup"
    mysqldump: [Warning] Using a password on the command line interface can be insecure.
    mysql: [Warning] Using a password on the command line interface can be insecure.

    As you can see, we decompressed the output stream on the remote side before feeding it into the database.
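
    If the remote MySQL port is reachable directly, an alternative sketch is to skip SSH and let the mysql client connect to the remote host itself; the address and credentials below are the placeholders from the example above, and --compress asks the client to compress the traffic instead of gzip:

    [root@test ~]# mysqldump -u root -p'12345678' example_db -R | mysql -h 10.10.10.10 --compress -u root -p'12345678' -D example_db_backup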

    More Options

    You can add further options to any of the use cases above to meet your own requirements. These techniques can also be used in shell scripts and scheduled as routine backup jobs, as in the sketch below.
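
    For example, a crontab entry like the following (hypothetical path and schedule) would take a compressed dump every night at 02:30 and put the date in the file name; note that % must be escaped in a crontab line:

    30 2 * * * mysqldump -u root -p'12345678' example_db -R | gzip > /backup/example_db_$(date +\%F).sql.gz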

    Exclude Some Tables

    Whenever you want to exclude a big or rarely changing table to save some space, you can add the --ignore-table option to the command.

    [root@test ~]# mysqldump -u root -p'12345678' example_db --ignore-table=example_db.a_very_big_table > example_db_data.sql
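
    To exclude more than one table, repeat the option once per table (the second table name below is just a placeholder):

    [root@test ~]# mysqldump -u root -p'12345678' example_db --ignore-table=example_db.a_very_big_table --ignore-table=example_db.another_big_table > example_db_data.sql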

    Add a Timer

    To know how much time the dump will take, you can prefix the command with time.

    [root@test ~]# time mysqldump -u root -p'12345678' example_db > example_db_data.sql
    mysqldump: [Warning] Using a password on the command line interface can be insecure.

    real    2m34.493s
    user    2m46.269s
    sys     0m19.949s

    Put Jobs to Background

    To avoid accidental interruptions, you can run the job in the background by adding a leading nohup and a trailing & to the command.

    [root@test ~]# nohup mysqldump -u root -p'12345678' example_db > example_db_data.sql &
    [1] 2602
    ...
    [root@test ~]# jobs
    [1]+  Running                 nohup mysqldump -u root -p'12345678' example_db > example_db_data.sql &
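
    One caveat: the password warning that mysqldump prints on stderr may land in nohup.out or even get mixed into the redirected SQL file, depending on how nohup handles the descriptors. Redirecting stderr explicitly keeps the dump clean; dump.err below is just an example name:

    [root@test ~]# nohup mysqldump -u root -p'12345678' example_db > example_db_data.sql 2> dump.err &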

    That's it. I hope you find these tips helpful.
