SQLyog › Sync tools, Migration, Scheduled Backup and Notifications › 2 MySQL Databases?
September 11, 2004 at 4:30 am (#8566) – tpatterson (Member)
Ok, bear with me here. I am not only just now learning how to use SQLyog, but also learning how to use MySQL in general. When they invented the term “newbie” they were referring to me 😛
I have a MySQL database hosted on my server with about 35 tables in it. This is an X-Cart database, if anyone is familiar with it.
Due to the number of products that we have available (1,200,000+), I have 15 separate CSV files on my local machine: 14 CSV files which contain the detailed information for each individual product, and a 15th file which is a master compilation of all 14.
GOAL:
To be able to somehow synchronize these CSV files (or preferably just the master) to update the 35 tables within the database itself. These CSV files contain information which is split between the tables, meaning that the “images” information goes into a table called “images” while the pricing goes into a table called “pricing”, etc.
Nearly every table has its own independent “record id” number which is based on a key column within the CSV files themselves. I need to be able to update only the rows in the tables which have different information in the CSV files. This could be a price change, an in-stock change, etc., but there is no need to simply overwrite the data in all rows of a table, as only about 5% change on a daily basis.
Is there a way to do this with all of the tables, extracting the information from the master (or individual) CSV files?
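To make the goal concrete, here is roughly what I picture for one of the tables (the file, table, and column names below are only examples – ours differ):

-- load the CSV into a staging table whose layout mirrors the target table
LOAD DATA LOCAL INFILE 'pricing.csv'
INTO TABLE pricing_staging
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

-- then touch only the rows whose values actually differ
-- (the comparison assumes the columns are not NULL)
UPDATE pricing AS p, pricing_staging AS s
SET    p.price    = s.price,
       p.in_stock = s.in_stock
WHERE  p.record_id = s.record_id
  AND (p.price <> s.price OR p.in_stock <> s.in_stock);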
Thanks for any suggestions.
September 11, 2004 at 4:37 pm (#16274) – peterlaursen (Participant)
I just wonder: why do you store data permanently in .CSV-files?
Why not a table in a database?
CSV-files are meant for backup and import/export purposes.
It's no problem to have two or more MySQL-servers on the network – they just have to use different port-numbers.
I'm sorry if I missed the point!
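For example, you can run each server from its own option file – the paths and the second port-number here are just examples:

# option file for the first server, e.g. /etc/my1.cnf (hypothetical path)
[mysqld]
port    = 3306
socket  = /tmp/mysql1.sock
datadir = /usr/local/mysql/data1

# option file for the second server, e.g. /etc/my2.cnf (hypothetical path)
[mysqld]
port    = 3307
socket  = /tmp/mysql2.sock
datadir = /usr/local/mysql/data2

Start each instance against its own file, e.g. mysqld --defaults-file=/etc/my1.cnf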
September 12, 2004 at 3:34 pm (#16275) – tpatterson (Member)
peterlaursen wrote on Sep 11 2004, 11:37 AM:
> I just wonder: why do you store data permanently in .CSV-files?
> Why not a table in a database?
> CSV-files are meant for backup and import/export purposes.
> It's no problem to have two or more MySQL-servers on the network – they just have to use different port-numbers.
> I'm sorry if I missed the point!
I do not believe that I said anything at all about storing data permanently in a CSV file. Our inventory is in fact stored, manipulated, and processed in an MS Access database. The data is then exported to CSV and imported into MySQL.
When you are dealing with over 1,250,000 products, this makes for a rather large CSV file if you leave it in one chunk. We have 13 major categories of products, so we break the inventory exports into 13 separate CSV files. I have found it to be MUCH easier to manipulate chunks of approximately 100,000 items during an import (speed, possible errors, etc.) rather than 1,250,000.
So I guess if you cannot see the rationale for that, then I am sorry. Perhaps if you were to attempt an upload to your servers of 1,250,000 products (which, by the way, encompasses just a hair over 198MB of data) at one time, you would see the reason for splitting them up.
Thank you.
September 13, 2004 at 6:00 pm (#16276) – Ritesh (Member)
One possible solution that comes to mind:
1) Start using MySQL instead of Access at the client-end.
2) Schedule a “Data Sync Job” using SQLyog. SQLyog will update only the changed rows during the sync process, as required. You can do away with exporting data to CSV, etc.
If you still need to use the “front-end” features of Access, use Access to make an ODBC connection to MySQL.
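If you ever need to script part of that yourself, the effect of a one-way sync on a single table is roughly the statement below (illustrative table and column names; INSERT ... ON DUPLICATE KEY UPDATE needs MySQL 4.1 or later, and record_id must be a PRIMARY or UNIQUE key):

-- insert new rows and update the changed ones in a single statement
INSERT INTO pricing (record_id, price, in_stock)
SELECT s.record_id, s.price, s.in_stock
FROM pricing_staging AS s
ON DUPLICATE KEY UPDATE
    price    = s.price,
    in_stock = s.in_stock;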