September 30th, 2000, 09:35 PM
I am trying to insert a text file into MySQL. The file is a 129 MB comma-delimited file. It has 3,313,648 lines of information, 3 columns each line.
partnumber linenumber productinfo
Each value in the partnumber column appears on about 45 lines.
The linenumber column is numbered 1 through x, with x being the number of rows in that product's description.
The productinfo column has a 70 character description of the product.
What is the fastest way to load this file? It has been running for over 4 hours already and still seems to be going.
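One common way to speed up a bulk load like this (a sketch only; the table and file names are from your post, the column types are guesses, and this assumes a MyISAM table) is to load into a table with no indexes and add the index afterwards, so MySQL builds the key file in one pass instead of updating it for every one of the 3.3 million rows:

```sql
-- Create the table with no indexes yet; loading into an
-- unindexed table avoids per-row key-file updates.
CREATE TABLE shoptech (
    partnumber  VARCHAR(20)  NOT NULL,   -- column types are assumptions
    linenumber  INT          NOT NULL,
    productinfo VARCHAR(70)  NOT NULL
);

-- Bulk-load the comma-delimited file.
LOAD DATA INFILE '/home/masters/NOTESFUL.TXT'
    INTO TABLE shoptech
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n';

-- Add the index only after the data is in; one single-pass
-- index build is far cheaper than row-by-row maintenance.
ALTER TABLE shoptech ADD INDEX (partnumber);
```

If the table already has indexes from a previous attempt, dropping them before the load and re-adding them afterwards has the same effect.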
After I get it loaded, I want to take all the rows that share the same partnumber value, combine their productinfo lines, and put the result in a different table. I can't figure out how to do this either.
So new table would look like this
partnumber productinfo
The reason I want to do this is to speed up the searches: instead of looking through 3,313,648 rows it will only look through 69,813 rows.
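One way to build that summary table (a sketch; the table name shoptech_summary and column types are assumptions) is a GROUP BY that glues the description lines back together in line order. Note that GROUP_CONCAT only exists in newer MySQL versions; on a server that lacks it, this step would have to be done in a small script that reads the rows ordered by partnumber, linenumber:

```sql
-- One row per partnumber, with all description lines combined.
CREATE TABLE shoptech_summary (
    partnumber  VARCHAR(20) NOT NULL,
    productinfo TEXT        NOT NULL,
    PRIMARY KEY (partnumber)
);

-- ~45 lines x 70 chars is about 3150 characters per part, which
-- exceeds GROUP_CONCAT's default 1024-byte limit, so raise it first.
SET SESSION group_concat_max_len = 4096;

INSERT INTO shoptech_summary (partnumber, productinfo)
SELECT partnumber,
       GROUP_CONCAT(productinfo ORDER BY linenumber SEPARATOR ' ')
FROM shoptech
GROUP BY partnumber;
```

With the PRIMARY KEY on partnumber, searches against the summary table hit one indexed row per part instead of scanning ~45 detail rows each.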
This is what I get with:
mysqladmin -i10 processlist status
+----+------+-----------+----------+---------+-------+-------+------------------------------------------------------------------------------------------------------+
| Id | User | Host | db | Command | Time | State | Info |
+----+------+-----------+----------+---------+-------+-------+------------------------------------------------------------------------------------------------------+
| 36 | root | localhost | shopcart | Query | 11560 | | LOAD DATA INFILE '/home/masters/NOTESFUL.TXT' INTO TABLE shoptech FIELDS TERMINATED BY ',' LINES TER |
| 39 | root | localhost | | Query | 0 | | show processlist |
+----+------+-----------+----------+---------+-------+-------+------------------------------------------------------------------------------------------------------+
Uptime: 13462 Threads: 2 Questions: 1314 Slow queries: 6 Opens: 55 Flush tables: 2 Open tables: 1 Queries per second avg: 0.098
Does everything seem right?
Larry