
I happened to back up a MySQL database into one huge SQL dump file of 123 GB. When I tried to restore it on another computer, I found the process very slow (especially the enable-keys step for each table). I decided to abandon some tables, but for such a large file there is no good way to edit it; even tasks as simple as deleting the first or last line are difficult. So I wrote a small Python script to split one huge dump file into small pieces.

The idea is simple: split the input file at comment headers of the form

--
-- blah blah blah
--

Usage:

dumpSplitter -i inputFileName -p prefix -n initSerial

Example:

dumpSplitter -i mydb.dump -p abc_ -n 1000

will read in mydb.dump and generate files abc_1000, abc_1001, ...
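The splitting logic can be sketched as follows. This is a minimal illustration, not the original script: the function name `split_dump` and the exact header-matching rules (a bare `--` line, a `-- comment` line, then another bare `--` line) are assumptions based on the pattern shown above. Content before the first header goes into the first output file, so nothing from the dump is lost.

```python
def split_dump(input_path, prefix, start_serial):
    """Split a MySQL dump into pieces, starting a new piece at each
    three-line comment header of the form:
        --
        -- blah blah blah
        --
    (Hypothetical sketch; the real dumpSplitter may differ.)
    """
    serial = start_serial
    out = open(f"{prefix}{serial}", "w")   # first piece holds the preamble
    serial += 1
    pending = []                           # lines that may form a header
    with open(input_path, errors="replace") as src:
        for line in src:
            bare = line.rstrip("\n") == "--"
            if not pending:
                if bare:
                    pending = [line]       # possible start of a header
                else:
                    out.write(line)
            elif len(pending) == 1:
                if bare:
                    out.write(pending[0])  # stray '--'; this one may start a header
                    pending = [line]
                elif line.startswith("--"):
                    pending.append(line)   # the '-- blah blah blah' middle line
                else:
                    out.writelines(pending + [line])
                    pending = []
            else:                          # have '--' and '-- blah blah blah'
                if bare:
                    out.close()            # header complete: open the next piece
                    out = open(f"{prefix}{serial}", "w")
                    serial += 1
                out.writelines(pending + [line])
                pending = []
    out.writelines(pending)                # flush a trailing partial header
    out.close()
```

Because the file is read line by line and only a three-line buffer is kept in memory, this approach handles dumps far larger than RAM.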
