
Import HUGE table (~1G) via query

    • #8406

      I'm new to SQLyog. I am attempting to import over 10 GB (~650M records) of data from Tandem NonStopSQL into MySQL in chunks. As a test, I tried to import a relatively small amount of data (~2M records). I noticed the following:

      1) After I entered the query and clicked Next, SQLyog became non-responsive for about half an hour.

      2) During that half hour, SQLyog was pulling data from Tandem.

      3) Memory usage by SQLyog went up from ~12 MB to ~80 MB.

      4) After I selected the MySQL table where I wanted the imported data to go, the whole process repeated itself, so I waited another half hour.

      5) When I chose “save as SQL scripts in file”, the whole process repeated itself again, so I waited.

      6) At the end of step 5, network activity dropped to zero and SQLyog's CPU usage went from 0% to 99%. In the end, a 320 MB .sql file was created.

      Questions:

      1) First of all, it seems odd to me that I have to wait ~1.5 hours just to find out whether the whole process works or not. Am I doing this the right way?

      2) How am I supposed to import 10 GB of data if SQLyog tries to store the data in memory first?

      3) Is there a more descriptive help file where I can get better instructions?

      Thanks.
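
      One way to do the kind of chunked, low-memory pull being asked about in question 2 is to fetch the result set in fixed-size batches and stream each batch straight to disk, so the client never holds the whole result set in memory. A minimal sketch, assuming the pyodbc package and a DSN named TANDEM (both placeholders, not taken from this thread):

```python
import pyodbc

BATCH = 10_000  # rows fetched per round trip; tune for the driver

def dump_query(dsn, query, out_path):
    """Stream the result of `query` to a tab-separated file, batch by batch."""
    total = 0
    conn = pyodbc.connect("DSN=" + dsn)
    try:
        cur = conn.cursor()
        cur.execute(query)
        with open(out_path, "w", encoding="utf-8") as out:
            while True:
                rows = cur.fetchmany(BATCH)     # only BATCH rows held in memory
                if not rows:
                    break
                for row in rows:
                    # \N is the NULL marker MySQL's LOAD DATA INFILE expects
                    out.write("\t".join("\\N" if v is None else str(v) for v in row))
                    out.write("\n")
                total += len(rows)
    finally:
        conn.close()
    return total

if __name__ == "__main__":
    n = dump_query("TANDEM", "SELECT * FROM source_table", "chunk.tsv")
    print("wrote", n, "rows")
```

      Writing a tab-separated file rather than a huge .sql script keeps the dump compact and lets MySQL ingest it in bulk later; whether the NonStop SQL ODBC driver actually streams rows this way depends on its cursor settings, as discussed further down the thread.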

    • #15715
      Ritesh
      Member

      I suspect this is an issue with the efficiency of the ODBC driver. Have you tried accessing the data through any other ODBC client, such as Excel or Access?

    • #15716

      Here is what I have found by playing around with the NonStop SQL ODBC driver. When I switch the cursor mode from read_only_streaming to read_only, the high memory usage no longer occurs. However, the data transfer rate is significantly affected: I'm down to 120 records/sec. At that rate it will take me about two months to import 560M records!

      Looks like I'll have to find some other way to get around this… Any suggestions?
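
      As a quick sanity check on the figure quoted above, 560M records at 120 records/sec works out to roughly 54 days, so "two months" is about right:

```python
# Sanity check of the "2 months" estimate: 560M rows at 120 rows/sec.
rows = 560_000_000
rate = 120                      # rows per second with the read_only cursor
days = rows / rate / 86400      # 86400 seconds per day
print(f"{days:.0f} days")       # ~54 days, i.e. roughly two months
```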

    • #15717
      Shadow
      Member

      Can you export the data as a CSV file from your db?
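
      For context, if a CSV/TSV dump can be produced (for example with the streaming sketch earlier in this thread), MySQL can bulk-load it with LOAD DATA INFILE, which is far faster than row-by-row inserts. A minimal sketch, assuming the mysql-connector-python package, placeholder credentials and table names, and a server configured to allow local infile:

```python
import mysql.connector

conn = mysql.connector.connect(
    host="localhost",
    user="import_user",          # placeholder credentials
    password="secret",
    database="target_db",
    allow_local_infile=True,     # needed for LOAD DATA LOCAL INFILE
)
cur = conn.cursor()
cur.execute(
    """
    LOAD DATA LOCAL INFILE 'chunk.tsv'
    INTO TABLE target_table
    FIELDS TERMINATED BY '\\t'
    LINES TERMINATED BY '\\n'
    """
)
conn.commit()
cur.close()
conn.close()
```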

    • #15718

      Since ODBC is the only way for me to communicate with the server, it doesn't matter what format of data I request; the performance issue will still be there.

    • #15719
      Shadow
      Member

      Then this problem is beyond our capabilities…
