Forums › SQLyog › Using SQLyog › Error Importing CSV File
February 28, 2005 at 8:28 pm #8825 - bimp (Member)
Using SQLyog Enterprise v4.01 and having trouble importing a CSV file. I keep getting the error message
“Error while importing data” and don't get any other helpful error message in the History tab.
I have a table that consists of the following:
exp_pricelist_prices
==============
Field          Type                  Null  Key  Default  Extra           Privileges
price_id       smallint(6) unsigned        PRI  (NULL)   auto_increment  select,insert,update,references
printsize_id   smallint(6)                      0                        select,insert,update,references
pricelist_id   smallint(6)                      0                        select,insert,update,references
price          float                            0                        select,insert,update,references
sort           tinyint(4)                       50                       select,insert,update,references
In the import dialog box, I make sure to de-select the 'price_id' primary key column; my CSV file only contains
data for the other 4 fields. I also have the fields terminated by a comma and have tried both \n and \r\n as the line
terminator. I also make sure to ignore the first line, which contains the field headers.
I've attached my CSV file. Please help.
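For reference, with those settings SQLyog should be generating something roughly like the statement below (the file path here is just a placeholder for my attached CSV):
Code:
load data local infile 'C:/path/to/exp_pricelist_prices.csv'
into table `exp_pricelist_prices`
fields terminated by ','
lines terminated by '\r\n'
ignore 1 lines
( `printsize_id`, `pricelist_id`, `price`, `sort` )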
-
February 28, 2005 at 8:33 pm #17099 - Ritesh (Member)
Can you cut-n-paste the CREATE TABLE STMT… for the table?
-
February 28, 2005 at 8:39 pm #17100 - bimp (Member)
Ritesh, thanks for replying. Here it is:
CREATE TABLE `exp_pricelist_prices` (
`price_id` smallint(6) unsigned NOT NULL auto_increment,
`printsize_id` smallint(6) NOT NULL default '0',
`pricelist_id` smallint(6) NOT NULL default '0',
`price` float NOT NULL default '0',
`sort` tinyint(4) NOT NULL default '50',
PRIMARY KEY (`price_id`)
) TYPE=MyISAM;
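The CSV itself is laid out to match the last four columns, roughly like this (the values below are just made-up placeholders, not the real data):
Code:
printsize_id,pricelist_id,price,sort
1,1,19.95,50
2,1,24.95,50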
-
February 28, 2005 at 9:02 pm #17101 - Ritesh (Member)
I was successfully able to import the 92 rows of data in the file without any problem. The screenshots depict the steps I followed.
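If you want to double-check that all 92 rows made it in on your side, a quick query like this will do it:
Code:
select count(*) from `exp_pricelist_prices`;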
-
February 28, 2005 at 9:03 pm #17102 - Ritesh (Member)
Create the table using the statement you provided.
-
February 28, 2005 at 9:04 pm #17103 - Ritesh (Member)
Select Table -> Export/Import -> Import data from CSV… and enter the correct values.
-
February 28, 2005 at 9:04 pm #17104 - Ritesh (Member)
Click Import
-
February 28, 2005 at 9:05 pm #17105 - Ritesh (Member)
Which version of MySQL are you using?
-
February 28, 2005 at 9:11 pm #17106 - bimp (Member)
Ritesh, very interesting.
I'm using 3.23.58 on Windows XP
-
February 28, 2005 at 9:18 pm #17107 - bimp (Member)
Ritesh, I finally got it to work.
This time around the only difference was that my escape string was \ instead of
-
February 28, 2005 at 9:19 pm #17108 - Ritesh (Member)
Code:
version()
--------------
3.23.58-max-nt

Same result.
My history tab shows the following query.
Code:
load data local infile 'D:/Documents and Settings/Administrator/Desktop/Fulfillment_Catalog_exp_pricelist_prices.txt' into table `bimp`.`exp_pricelist_prices` fields escaped by '\\' terminated by ',' lines terminated by '\r\n' ignore 1 lines ( `printsize_id`, `pricelist_id`, `price`, `sort` )
-
February 28, 2005 at 9:34 pm #17109 - bimp (Member)
I really don't know why it worked this time around. I'm just glad it works… maybe your good karma did it. I was able to push the data to all my dev, staging, and live databases.
Thanks.
-