Forum Replies Created
sub-zero (Member)
Hi, thanks for all your kind help. I have solved this problem.
First, I selected the right my.cnf template. Please read these files carefully: my-huge.cnf, my-large.cnf and the others have different settings depending on your RAM and how you use MySQL. In my case only my-large.cnf worked, because we only have 768 MB of RAM.
Second, I had to change max_allowed_packet=700M; I do not know why only this value works.
Third, I increased the swap size from 1.5 GB to 2.5 GB using a swap file on Red Hat 9; you can search for instructions on http://www.redhat.com.
Lastly, I executed the backup batch file using the Webmin MySQL manager.
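A minimal sketch of the config steps on the command line, assuming the sample files live in /usr/share/mysql and the init script is named mysqld (both are only assumptions for a Red Hat 9 RPM install; adjust to your setup):

# copy the sample config that matches the machine's RAM (path assumed)
cp /usr/share/mysql/my-large.cnf /etc/my.cnf

# in the [mysqld] section of /etc/my.cnf, set the packet limit, e.g.:
#   max_allowed_packet = 700M

# restart MySQL so the change takes effect (init script name assumed)
service mysqld restart

# confirm the running value
mysql -u root -p -e "SHOW VARIABLES LIKE 'max_allowed_packet';"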
Hope this is useful for others.
One more thing: do I need to unselect “Create Bulk Insert Statements” when I back up data next time?
:))
sub-zero (Member)
Thanks for your help.
I am trying to add a swap file on the Red Hat system.
Maybe that will work.
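A minimal sketch of adding a swap file, assuming a 1 GB file at /swapfile (size and path are only examples; run as root):

# create the file, mark it as swap, and enable it
dd if=/dev/zero of=/swapfile bs=1M count=1024
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile

# to enable it at every boot, add a line like this to /etc/fstab:
#   /swapfile  swap  swap  defaults  0 0

# check the new total
free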
sub-zero (Member)
Thanks.
Here is our server's info:
[root@edms root]# free
             total       used       free     shared    buffers     cached
Mem:        771804      46216     725588          0        880      10780
-/+ buffers/cache:      34556     737248
Swap:      1566328      29668    1536660
[root@edms root]#
I found another topic similar to my problem here:
I have two computers, both running SQLYog 4.03. Windows XP Home and W98 SE.
If I run a batch file, it will pause every 64k, and sometimes stop with an error message. This happens with different hosts.
At the moment, I have to split larger files – I just split a 482k file into 9, to stay under this limit.
Any suggestions?
Are there any tools that can split the *.sql file into small pieces?
Thanks again for your help.
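A minimal sketch of one way to split a dump, assuming the file is called backup.sql and each row has its own INSERT line (i.e. “Create Bulk Insert Statements” was not used; file names and line counts are only examples):

# split the dump into pieces of 500 lines each
split -l 500 backup.sql backup_part_

# this produces backup_part_aa, backup_part_ab, ...;
# note that a multi-line CREATE TABLE statement could still be cut in half,
# so check the piece boundaries before importing them one by one.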
sub-zero (Member)
Thanks for your help first.
I use SQLyog V3.71 to export data as SQL statements, and with the default settings these items were selected during backup:
1. include “USE dbname-” statement
2. add create database
3. include “drop table” statement
4. Lock all tables for read
5. Create Bulk Insert Statements
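For what it's worth, a rough sketch of the difference option 5 makes in the dump (table and column names here are made up): with bulk inserts, all rows of a table end up in one very large INSERT statement, and that whole statement has to fit inside max_allowed_packet; without it, each row gets its own small INSERT.

# with “Create Bulk Insert Statements”: one huge statement per table, e.g.
#   INSERT INTO files (id, filedata) VALUES (1, '...'), (2, '...'), (3, '...');
# without it: one small statement per row, e.g.
#   INSERT INTO files (id, filedata) VALUES (1, '...');
#   INSERT INTO files (id, filedata) VALUES (2, '...');

# count the lines containing INSERT in a dump to see which style it uses
grep -c "INSERT INTO" backup.sql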
The database server is Red Hat 9.0 + MySQL V4.1.13.
Yes, the table info I got says it is a LONGBLOB type:
filedata longblob
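A minimal sketch of how to check the largest BLOB, so max_allowed_packet can be set at least that big (the database and table names are made up; replace them with the real ones):

# report the size in bytes of the biggest filedata value
mysql -u root -p -e "SELECT MAX(LENGTH(filedata)) FROM mydb.files;"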