
Splitting a very large MySQL dump file

Ever have a several-gigabyte MySQL dump and only need to extract a table or two out of it? Here’s something you can run from the command line that will split all of the tables into individual files. You’ll need to know whether your dump file includes CREATE TABLE statements or contains just the data. If it’s just the data, change CREATE TABLE to LOCK TABLES in the script below.

One thing to note: if awk gives you an error such as “Program Limit Exceeded”, you can use gawk instead and it should work without issues.

[sourcecode language="bash"]
cat dumpfile.sql | awk 'BEGIN {
  # lines before the first CREATE TABLE go into a file named "comments"
  output = "comments"; }
  $0 ~ /^CREATE TABLE/ {
    # each CREATE TABLE starts a new file named after the table ($3 with the backticks stripped)
    close(output);
    output = substr($3,2,length($3)-2); }
  # append every line to the current output file
  { print $0 >> output }'
[/sourcecode]
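
If your dump contains only the data, here’s roughly what the LOCK TABLES version would look like (a sketch assuming the usual “LOCK TABLES `name` WRITE;” lines that mysqldump writes, where the table name is again the third field):

[sourcecode language="bash"]
cat dumpfile.sql | awk 'BEGIN {
  output = "comments"; }
  # mysqldump writes "LOCK TABLES `name` WRITE;", so the name is field $3
  $0 ~ /^LOCK TABLES/ {
    close(output);
    output = substr($3,2,length($3)-2); }
  { print $0 >> output }'
[/sourcecode]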

3 Responses to “Splitting a very large MySQL dump file”

  1. Jay says:

    If you need to split into databases, you can use “CREATE DATABASE” as the regex instead. If you find this hard to use, any text file splitter that accepts regex values will also work; just use CREATE DATABASE as the pattern. A quick sketch is below.
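
    A sketch of that variant (untested, assuming a plain “CREATE DATABASE `name`;” line; mysqldump often adds IF NOT EXISTS and version comments, which would shift the fields):
    [code]
    cat dumpfile.sql | awk 'BEGIN {
    output = "comments"; } $0 ~ /^CREATE DATABASE/ {
    close(output);
    name = $3; gsub(/[`;]/, "", name);
    output = name ".sql"; }
    { print $0 >> output }'
    [/code]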

  2. Mark says:

    I searched long for this script. Thanks,
    Mark

  3. Mark says:

    Rick – I used this code when I had DROP TABLE IF EXISTS:
    [code]
    cat dumpfile.sql | gawk 'BEGIN {
    output = "comments"; } $0 ~ /^DROP TABLE IF EXISTS/ {
    close(output);
    output = substr($5,2,length($5)-3) ".sql"; }
    { print $0 >> output }'
    [/code]

    I also added an extension.

    Mark
