
Odbc Import And Errors With Sja.exe


Viewing 9 reply threads
  • Author
    Posts
    • #10509
      David G
      Member

      I am a new user, and I am attempting to import table data from a network database to a local database using a SQL query in the Migration Toolkit. I was able to import the data successfully once as a test. Afterwards, I dropped the resulting new table and attempted to import the complete table dataset again, and now I get a Windows error each time saying that SJA.exe has encountered a problem and needs to close. There is nothing in the log for this error that tells me what the problem is. Does anyone have any idea what may be causing this? I am using version 5.19. If other info would be helpful in the diagnosis, please let me know. Thanks!

    • #24759
      peterlaursen
      Participant

      If you are a new user, how is it that you are using an old version? We stopped distributing this version back in October 2006. I also find no reasonable match in our customer database with your details (name and email address). SQLyog Enterprise is NOT free software!

      Basically it makes no sense to discuss any such issue with version 5.19 (we cannot fix issues in old versions – only in new versions!). Please test with the latest version and see if problems persist. You can download a TRIAL from our download page if you are not able to download a registered version. The TRIAL is restricted to importing max. 2 tables per job, but has no other restriction for the month it will run.

      If problems do persist, we need (for a start) more information about

      * what kind of database it is (including version). 'Network database' does not tell much!

      * information about the ODBC driver.

      (We will probably need more specific info after that, but what we need depends on the answers to those questions.)

      But first of all, try 6.05 or 6.06!

    • #24760
      David G
      Member
      peterlaursen wrote on Aug 28 2007, 04:40 PM: (quoted in full above)

      I am using an existing licensed copy, probably purchased by my predecessor, Jay Shi. I won't be upgrading because I plan to move away from MySQL. I am looking for suggestions to work through or around the issue I described in my original post. I can't believe I am the only one that has ever had an ODBC import work once, and then fail when attempted again. Is there anyone here who can assist?

    • #24761
      David G
      Member
      David G wrote on Aug 28 2007, 11:20 PM: (quoted in full above)

      I will post the additional requested info tomorrow morning from the office.

    • #24762
      peterlaursen
      Participant

      If the full name of Jay Shi is 'Jay Jianxin Shi', we have a match in our database. Original purchase 2005-08-12; version 5.19 would then be the latest free upgrade for this purchase.

      We will need more info about the type of database, and maybe also the structure and data! What we know now is only like 'I start my car, but the engine dies immediately. But once it worked.' What kind of car, how is it fuelled, etc.?

      Please also explain what the difference was between the attempt that succeeded and the ones that did not. Only the amount of data? Can you repeat the success?

      I do not believe in 'workarounds' for this! If there was an issue with 5.19 and it still exists in 6.x, we will of course fix it. But a fix is possible only in future versions!

      If you import more tables, you can try one table per job to find the problematic one.
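      When the job imports only one table, the same one-at-a-time idea can be applied to the rows themselves. The following is a minimal sketch, not part of SQLyog; the `fails` callback and the sample rows are hypothetical stand-ins for "attempt an import of this batch and see whether it crashes":

      ```python
      def find_bad_row(rows, fails):
          """Binary-search a list of rows for one that makes an import fail.

          `fails(batch)` is a caller-supplied callback that attempts to
          import `batch` and returns True if the import fails. Assumes one
          bad row exists and that any batch containing it fails.
          """
          lo, hi = 0, len(rows)
          while hi - lo > 1:
              mid = (lo + hi) // 2
              if fails(rows[lo:mid]):
                  hi = mid  # the bad row is in the first half
              else:
                  lo = mid  # the first half is clean, so look in the second
          return rows[lo]

      # Fake data: pretend a row containing a NUL byte crashes the import.
      rows = ["row-1", "row-2", "bad\x00row", "row-4"]
      bad = find_bad_row(rows, lambda batch: any("\x00" in r for r in batch))
      # bad == "bad\x00row"
      ```

      Each probe costs one trial import, so isolating a single offending record out of 8,782 rows would take about 14 probes.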

    • #24763
      David G
      Member
      peterlaursen wrote on Aug 29 2007, 04:18 AM: (quoted in full above)

      Yes, Jianxin Shi is the correct name.

      I am attempting to import data from an Oracle database using a SQL query. The query is (schema names deleted and sensitive info masked):

      select oh.created as "createdate",
      a.name as "accountnum",
      oi.accnt_Order_num as "ordernum",
      oi.x_show_num as "shownum",
      i.name as "itemnum",
      pl3.name as "dept",
      pl5.name as "division",
      pl2.name as "category",
      pl1.name as "subcategory",
      oi.x_total_amt as "ordertotal",
      oi.x_total_sales_amt as "totalprice",
      oi.QTY_REQ as "qty",
      cr.X_COMMENT_NUM as "commentnum",
      cr.X_COMMENT as "comment",
      oh.X_ADDNL_COMMENT as "additionalcomment",
      e.login as "login",
      e.fst_name as "firstname",
      e.last_name as "lastname",
      e.emp_num as "empnum"
      from cx_s_order_item_hist oh,
      s_order_item oi,
      cx_s_comment_rule cr,
      s_employee e,
      s_order o,
      s_org_ext a,
      s_prod_int i,
      s_prod_ln pl1,
      s_prod_ln pl2,
      s_prod_ln pl3,
      s_prod_ln pl4,
      s_prod_ln pl5
      where oh.created between ('13-AUG-2007') and ('28-AUG-2007')
      and oh.x_comment_rule_id in
      ('9-999999', '9-999999', '9-999999', '9-999999', '9-999999', '9-999999')
      and oh.par_row_id = oi.row_Id
      and oh.x_comment_rule_id = cr.row_id
      and oh.created_by = e.row_id
      and oi.order_id = o.row_id
      and o.accnt_id = a.row_id
      and oi.prod_id = i.row_id
      and i.PR_PROD_LN_ID = pl1.row_id
      and pl1.PAR_PROD_LN_ID = pl2.ROW_ID
      and pl2.par_prod_ln_id = pl3.row_id
      and pl3.par_prod_ln_id = pl4.row_id
      and pl4.par_prod_ln_id = pl5.row_id

      The amount of data was the same for both the successful and unsuccessful imports, at 8,782 rows. Here is a sample record. I have changed some of the sensitive info in the record to all 9 or all x, truncated because of space limitations here:

      08/13/2007 09:35:49 999999999 999999999 99 999999 DEP-99 DIV-9 CAT-999

      SC-999-99 144.9000000 144.9000000 1.0000000 670

      Order confirmed by xxxxx no customer contact xxxxx xxxxxxx x xxxxxx 9999999999

      I am letting the migration toolkit create a new table for me, and the columns are being defined as depicted in the attached Word doc called ODBC Import.

      I have not been able to successfully import since the first try early yesterday, and I receive the same error every time. Ideas or thoughts?
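      Masking a sample record by hand, as above, is error-prone. A tiny helper like this (purely illustrative, not part of any tool mentioned here) does the 9/x substitution mechanically while preserving the record's layout; note it masks everything uniformly, so labels such as DEP would be replaced too:

      ```python
      import re

      def mask_record(text):
          """Mask sensitive values: every digit becomes '9' and every
          letter becomes 'x'; punctuation, whitespace and the overall
          field layout are left intact."""
          text = re.sub(r"\d", "9", text)
          return re.sub(r"[A-Za-z]", "x", text)

      print(mask_record("Order confirmed by JSmith, account 12345"))
      # xxxxx xxxxxxxxx xx xxxxxx, xxxxxxx 99999
      ```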

    • #24764
      David G
      Member
      David G wrote on Aug 29 2007, 09:50 AM: (quoted in full above)

      The ODBC connection uses the Microsoft ODBC for Oracle driver.
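      For reference, a DSN-less connection through that driver is usually specified with a string of the following shape. This is only a sketch: the server alias and credentials are placeholders, and keyword support can vary between driver versions:

      ```python
      # Build a DSN-less ODBC connection string for the Microsoft ODBC for
      # Oracle driver. All values below are placeholders.
      params = {
          "Driver": "{Microsoft ODBC for Oracle}",
          "Server": "my_tns_alias",  # TNS alias or host from tnsnames.ora
          "Uid": "scott",
          "Pwd": "tiger",
      }
      conn_str = ";".join(f"{k}={v}" for k, v in params.items())
      print(conn_str)
      # Driver={Microsoft ODBC for Oracle};Server=my_tns_alias;Uid=scott;Pwd=tiger
      ```

      A library such as pyodbc could then open a connection with this string and run the SELECT directly, which would help establish whether the driver itself, rather than sja.exe, is the crashing component.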

    • #24765
      peterlaursen
      Participant

      I will ask a test engineer to look into it.

      But we will check against the 6.06 and 6.1 code trees primarily.

      What is your Windows version?

      Did you try rebooting after getting those errors?

    • #24766
      David G
      Member
      peterlaursen wrote on Aug 29 2007, 09:57 AM: (quoted in full above)

      I did reboot several times, to no avail.

      I am running our company-standard Windows XP Professional v5.1, Service Pack 2.

    • #24767
      peterlaursen
      Participant

      We have not been able to reproduce it. However, I also think we need more exact information.

      We simply need a schema, populated with data, that crashes for you. A DUMP, actually (why give us 20 minutes of typing work if we can import a DUMP in 5 seconds?).

      You can fake some data and/or create a ticket if you do not want to expose things in public. But the crash must still happen with that schema and data. The error may be data-specific, and a single character may make the difference!

      Also, I still request that you try the version 6.x ENTERPRISE TRIAL.

      Basically, you should understand that we in principle do not support this version any more.

      At the least, we will have to ask for an exact and reproducible test case to work with, if we are to work on it.
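      Assuming the table from the one successful import is still present on the MySQL side, a dump of it could be produced with the standard mysqldump client (the database, table and user names below are placeholders). Whether a MySQL-side dump is enough to reproduce an ODBC-specific crash is uncertain, but it at least conveys the exact schema and data involved:

      ```shell
      # Dump a single table (structure + data) so the crash can be
      # investigated elsewhere; replace db_name, table_name and user.
      mysqldump -u user -p db_name table_name > table_name_dump.sql
      ```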
