When you need to import a large file into a MyISAM table that has many indexes, the following procedure may speed things up considerably.
- Create the table properly (with its indexes)
- Flush table
$ mysqladmin flush-tables -u a_db_user -p
- Disable all indexes for the table
$ sudo myisamchk --keys-used=0 -rq /var/lib/mysql/dbName/tblName
- Load the data with LOAD DATA LOCAL INFILE; adjust the field and line delimiters to match your csv file
$ mysql -u your_user_name -p
mysql> use your_db_name;
mysql> LOAD DATA LOCAL INFILE 'your_data_file.csv'
    -> INTO TABLE `your_tbl_name`
    -> FIELDS TERMINATED BY ';' ENCLOSED BY '"' ESCAPED BY '\\'
    -> LINES TERMINATED BY '\n'
    -> IGNORE 1 LINES;
- Re-create the indexes
$ sudo myisamchk --key_buffer_size=1024M --sort_buffer_size=1024M -rq /var/lib/mysql/dbName/tblName
- Flush table
$ mysqladmin flush-tables -u a_db_user -p
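The steps above can be tied together in one script. This is a minimal sketch, not part of the original tip: the function name and arguments are made up, and it assumes sudo access to the data directory, a server that permits LOCAL INFILE, and the same ';'-separated csv layout shown above.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical helper wrapping the whole import procedure.
# Usage: fast_myisam_import dbName tblName data.csv a_db_user
fast_myisam_import() {
    local db="$1" tbl="$2" csv="$3" user="$4"
    local tbl_path="/var/lib/mysql/$db/$tbl"

    # Flush the table so myisamchk sees a consistent on-disk state
    mysqladmin flush-tables -u "$user" -p

    # Disable all indexes for the table
    sudo myisamchk --keys-used=0 -rq "$tbl_path"

    # Bulk-load the csv (client and server must both allow LOCAL INFILE)
    mysql --local-infile=1 -u "$user" -p "$db" <<SQL
LOAD DATA LOCAL INFILE '$csv' INTO TABLE \`$tbl\`
  FIELDS TERMINATED BY ';' ENCLOSED BY '"' ESCAPED BY '\\\\'
  LINES TERMINATED BY '\n' IGNORE 1 LINES;
SQL

    # Rebuild the indexes with large buffers, then flush again
    sudo myisamchk --key_buffer_size=1024M --sort_buffer_size=1024M -rq "$tbl_path"
    mysqladmin flush-tables -u "$user" -p
}
```

Each -p flag prompts for the password interactively; drop the sudo calls if mysqld runs as your own user.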
LOAD DATA example
LOAD DATA LOCAL INFILE '/home/xxx/data.csv'
INTO TABLE mydb.mytable  -- make sure the table name is correct!
FIELDS TERMINATED BY ';'
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
(@id, field_a, field_b, @flag, @filetime)
SET id = @id + 10000,
    filetime = FROM_UNIXTIME(@filetime),
    flag = IFNULL(@flag, 0);
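For illustration, here is a hypothetical table definition the example above could target; the column names come from the statement, but the types are assumptions, not part of the original post.

```sql
CREATE TABLE mydb.mytable (
  id INT NOT NULL,       -- filled from the csv's first column via @id + 10000
  field_a VARCHAR(255),  -- loaded directly
  field_b VARCHAR(255),  -- loaded directly
  flag TINYINT,          -- IFNULL(@flag, 0) turns \N in the csv into 0
  filetime DATETIME      -- FROM_UNIXTIME(@filetime) converts a unix epoch column
) ENGINE = MyISAM;
```

Note that IFNULL only fires when the csv field is \N (or missing); an empty field between two separators loads as the empty string, not NULL.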
Refer to the LOAD DATA syntax documentation on mysql.com for more details.
Test environment
Ubuntu 10.10
MySQL server 5.1