
Utility to split SQL dump file


  • Author
    Posts
    • #8931
      just4fun
      Member

      Can you make a small utility to split an *.sql dump into parts of e.g. 800 KB?

      HTTP tunneling on my machine can't execute a dump > 1 MB (either the connection is lost, or it simply uploads 700+ KB and then does nothing).

      In this situation I have to manually split my 9 MB dump into 500 KB pieces =( (or use PMA and split into 1.5 MB pieces).

      Sorry, and thanks.

    • #17495
      peterlaursen
      Participant

      I have exactly the same problem right now!

      Didn't have it before.

      I'm struggling with a 12 MB SQL dump/upload right now!

      What are they thinking at the ISPs …

      I guess I'd better find a professional one!

    • #17496
      peterlaursen
      Participant

      I guess I found something interesting!!!

      The problem is that the individual SQL statements (the bulk INSERT INTO blocks) are too long!

      It's not the file as such, but the SQL statements. That's for HTTP tunnelling with my ISP.

      In my case each of them contains about 3000-4000 records and takes up about 1 MB within the file.

      If I divide each statement into three or four, the whole dump runs …

      RITESH .. why must the statements be so long? It's probably fine with a direct connection on port 3306 on a LAN or fast DSL.

      But it seems to be a problem with tunnelling.

      Think about a setting to let the user decide, or a popup: “Use this SQL with tunnelling?”

      I'll verify and report back!
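
      To illustrate what is being described here, a minimal sketch with a hypothetical table 'orders' and made-up rows (neither is from this thread): the dump's single oversized bulk INSERT can be divided into several shorter bulk INSERTs covering the same rows, each small enough to get through the tunnel.

        -- Hypothetical table, only so the sketch is self-contained:
        CREATE TABLE orders (id INT PRIMARY KEY, customer VARCHAR(50), total DECIMAL(8,2));

        -- One bulk statement as generated in the dump (thousands of rows, roughly 1 MB):
        INSERT INTO orders (id, customer, total) VALUES
          (1, 'Alice', 10.00),
          (2, 'Bob', 12.50),
          /* ... thousands more rows ... */
          (4000, 'Zoe', 7.25);

        -- Alternative form of the same statement, divided into shorter bulk statements
        -- (run one form or the other, not both):
        INSERT INTO orders (id, customer, total) VALUES
          (1, 'Alice', 10.00),
          /* ... rows up to 1000 ... */
          (1000, 'Carl', 3.10);
        INSERT INTO orders (id, customer, total) VALUES
          (1001, 'Dana', 8.80),
          /* ... rows up to 2000 ... */
          (2000, 'Erik', 5.45);
        -- ... and so on for the remaining rows.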

    • #17497
      peterlaursen
      Participant

      Confirmed!

      The individual SQL statements in the dump are MUCH too big for tunnelling (to the server) with a typical 128 kbit/s DSL line.

      Dividing each statement into 3-4 pieces works from here!

      12.5 MB uploaded then!

      Probably the idea was that the client should not negotiate the connection too often, but with tunnelling shorter statements would be better!

    • #17498
      Ritesh
      Member

      While generating the dump, uncheck “Generate bulk insert stmts.” in the options dialog. This will result in an individual INSERT INTO … query being generated for each row of data.

      Hope that helps.
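
      As a sketch of the difference, using the same hypothetical 'orders' table as above: with the option unchecked, the dump contains one small single-row INSERT per record instead of one huge bulk statement, so no individual statement can outgrow the tunnel, at the cost of many more statements to execute.

        -- With "Generate bulk insert stmts." unchecked, each row gets its own statement:
        INSERT INTO orders (id, customer, total) VALUES (1, 'Alice', 10.00);
        INSERT INTO orders (id, customer, total) VALUES (2, 'Bob', 12.50);
        INSERT INTO orders (id, customer, total) VALUES (3, 'Carl', 3.10);
        -- ... one INSERT per row for the rest of the table.

      This is also the trade-off vygi mentions further down: single-row statements stay small, but the whole dump takes longer to execute.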

    • #17499
      peterlaursen
      Participant

      oh … it was there 🙂

    • #17500
      vygi
      Member

      It's an old topic… but nevertheless:

      maybe it would be possible to get a new config parameter “max bulk statement size” (set e.g. to 128 KB by default), and then divide bulk insert statements into pieces of that size?

      Right now I got a timeout error because it took too long to upload a 500 KB bulk insert statement. It worked when I manually divided it into two ca. 250 KB parts.

      Of course it is possible to export single statements, but then it takes much longer to execute them.

      Regards,

      Vygi

    • #17501
      peterlaursen
      Participant

      Actually, I have also requested that 'Bulk Size' could be user-settable.

    • #17502
      vygi
      Member
      peterlaursen wrote on Nov 19 2005, 12:33 PM:
      Actually, I have also requested that 'Bulk Size' could be user-settable.


      Yes, it should be configurable.

      BTW, the max query size depends on more than just the MySQL server settings.

      In my case, the remote server was able to process up to 1 MB at once but reported a timeout error because of the low upload speed.
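
      For reference, assuming a standard MySQL setup (nothing specific to this particular host): the server-side ceiling on the size of a single statement is the max_allowed_packet variable, which can be checked and, given sufficient privileges, raised; the upload-speed timeout described above is a separate issue on the client/tunnel side.

        -- Check the server's limit on the size of a single query/packet:
        SHOW VARIABLES LIKE 'max_allowed_packet';

        -- Raise it for the running server (needs the SUPER privilege; value in bytes):
        SET GLOBAL max_allowed_packet = 16777216;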

    • #17503
      peterlaursen
      Participant

      And if you use HTTP tunnelling, it also involves the PHP configuration.
