I happened to back up a MySQL database into one huge SQL dump file, which was 123 GB. When I tried to restore it on another computer, the process was very slow (especially the "enable keys" step for each table). I decided to abandon some tables, but for such a large file there was no good way to edit it; even tasks like deleting just the first or last line were difficult. So I decided to split the file. After googling for a while and testing some tools, I could not find one that could do this. I also tried csplit (which has a line-length limit of 2048) and some scripts written in sed, ssed, and awk, but had no luck. Finally I wrote a small piece of Python code to split the huge dump file into small pieces, and it rocks.
The idea was simple: split the input file at lines matching patterns like
    --
    -- blah blah blah
    --
Attachments: patterns, dumpSplitter.py
Usage:

    dumpSplitter -i inputFileName -p prefix -n initSerial
...