April 28th, 2000, 01:59 PM
What is the fastest way to import a lot of data into my database? I could use the LOAD DATA INFILE statement, but as far as I understand it, that way I can only import data into one table at a time!
I have six tables, and they are all connected (the primary key of table A is also stored in table B, and so on).
So what should I do? Do I have to use C/Perl/PHP?
Thanks for your ideas!
In MySQL, there is no physical connection between tables (i.e. no pointers); relations are a logical concept. So if you have six files of tab-delimited records, you can simply import each one of them into your database. Then you use C/Perl/PHP to express the relations in the queries you run.
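To make that concrete, here is roughly what the per-file imports would look like. The file paths and table names (table_a, table_b, ...) are made up for illustration; LOAD DATA INFILE expects tab-delimited, newline-terminated rows by default, which matches the files described above:

```sql
-- One statement per file; repeat for each of the six tables.
LOAD DATA INFILE '/path/to/table_a.txt' INTO TABLE table_a;
LOAD DATA INFILE '/path/to/table_b.txt' INTO TABLE table_b;
```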
This is assuming that the data already contains the id numbers each table uses to connect to the others. If your record ids are generated by auto-increment columns, you'll have to write a script that loads one of the files, queries the assigned ids back, and uses them to build the data for the next table.
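A minimal sketch of that script logic, with hypothetical data: two tab-delimited data sets where the child rows reference the parent by name rather than by id. The assign_ids function stands in for the ids MySQL's auto-increment would hand out (in reality you would query them back after loading the first file), and rewrite_books fills in the real foreign keys before the second file is loaded:

```python
def assign_ids(names, start=1):
    """Simulate the sequential ids an auto-increment column would assign."""
    return {name: i for i, name in enumerate(names, start=start)}

def rewrite_books(book_rows, author_ids):
    """Replace the author-name column with the numeric id it maps to."""
    return [(title, author_ids[author]) for title, author in book_rows]

# Hypothetical contents of the two tab-delimited files.
authors = ["Knuth", "Ritchie"]
books = [("TAOCP", "Knuth"), ("K&R C", "Ritchie")]

ids = assign_ids(authors)            # ids assigned when authors were loaded
linked = rewrite_books(books, ids)   # books rows ready for LOAD DATA INFILE
print(linked)                        # [('TAOCP', 1), ('K&R C', 2)]
```

The same pattern repeats table by table: load one file, fetch the generated ids, rewrite the next file, load it.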
Another option you may consider:
Most SQL-compliant engines offer an SQL dump utility that exports table structure and data as SQL CREATE and INSERT statements.
e.g. in MySQL you could do this from the command line:
mysqldump -u user -ppassword database > /path/to/where/you/want/dump.sql
Then to import to mysql:
mysql -u user -ppassword database < /path/to/dump.sql
Note that you have to create the target database first (e.g. with mysqladmin create).
So check the docs for the SQL engine you are using to see if it supports a dump.