Forum Replies Created
AuthorPosts
MDSDigital (Member)
Peter,
We're running 5.5.
I have sent an email in to get a private ticket, and I can send you the SHOW CREATE TABLE output. The data will be quite large (over 3 million records).
Thanks.
Lucas C.
'peterlaursen' wrote: OK, we will check this. Also:
1) Please tell us your MySQL version.
2) If you can attach the result of SHOW CREATE TABLE for a table where this is reproducible, it would be nice. Or, even better, provide a full dump with data for such a table. If this is possible we will open an FTP upload option for you (and for privacy you may then continue in a private support ticket; you create one by sending a mail to [email protected]).
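For reference, the statement Peter asks for is a standard MySQL command; the table name `orders` below is only a placeholder for whichever table reproduces the problem:

```sql
-- Replace `orders` with the actual table where the issue occurs.
-- The output shows the full table definition, including engine and indexes.
SHOW CREATE TABLE orders;
```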
MDSDigital (Member)
Peter,
We tried a CHUNK of 1000 rows and of 10000, and the same thing happened. All the tables are InnoDB.
Attached is a copy of our MySQL configuration, and we're running SQLyog v8.82.
Any ideas?
Thanks Much.
Lucas C.
'peterlaursen' wrote: Please be careful with the term 'crashing'. Is there a crash dump? Also, I do not think it is 'locking' (in the SQL meaning of the word). Terms are important. You have not provided any information that gives a reasonable indication of either a 'crash' or a 'lock' (in the real meaning of those words).
Please read: http://www.webyog.co…-and-bulks.html
Please try to set a CHUNK value. I think the server tries to use more memory than it is configured to be allowed to use, or is swapping to disk. Tuning the server configuration could be a good idea too. Did you ever check (with MONyog or another tool) whether the buffer settings in your server configuration are reasonable? With default ('out of the box') settings you should not expect the server to handle that amount of data efficiently.
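The buffer check Peter suggests can be done directly against the server; the variable and status names below are the standard MySQL ones (values are in bytes), though which ones matter depends on the storage engine in use:

```sql
-- Main cache for InnoDB tables; too small a value forces constant disk I/O.
SHOW VARIABLES LIKE 'innodb_buffer_pool_size';
-- Index cache for MyISAM tables.
SHOW VARIABLES LIKE 'key_buffer_size';
-- A rising count here suggests temporary tables are spilling to disk.
SHOW GLOBAL STATUS LIKE 'Created_tmp_disk_tables';
```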
You may also
1) tell if tables are MyISAM or InnoDB (or something else)?
2) tell us the server version and attach your MySQL configuration file (my.ini/my.cnf)
.. And please always tell the SQLyog version you are using when reporting an issue.