
Forum Replies Created

Viewing 15 posts - 1 through 15 (of 15 total)
in reply to: Connecting To 1And1 Mysql Databases #33123

Here's the phpinfo file…


    in reply to: Connecting To 1And1 Mysql Databases #33122

I created a subfolder for the tunnel file and an empty .htaccess file, chmod 644 etc., with no change in result;

the test connection has no errors but shows the web page.

Those files are not in the physical root folder, and not in the forwarded subdirectory of the account where the document folders live,

but in their own new subfolder, and the connection tunneler is now pointing to that…

I believe it is a shared account, so there is no ability to view or change the Apache conf files – at least I think that is the case.

I have the output of phpinfo() though… what next?

Thanks for your prompt replies.

    'peterlaursen' wrote:

OK .. you are using HTTP tunnel, and there is a global mod_rewrite setting or something similar in the Apache configuration.

I think a solution will be to create an empty (sub-)folder, copy the tunneler file there, and turn off all Apache extensions for this particular folder. You can do this by putting an empty .htaccess file in the folder. Could you try this (if you have the privileges)? If there is still no success, we will involve some of our server maintenance people tomorrow. It will speed things up if you could share the Apache configuration (httpd.conf) and phpinfo() from this server in advance. You should not do that here in the Forums. You can create a private ticket by sending a mail to [email protected].
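If privileges allow it, switching rewriting off just for that folder usually comes down to a one-line .htaccess – a minimal sketch, assuming the host honours .htaccess overrides for that directory and that mod_rewrite is what forwards the request (both are assumptions here, not something confirmed in this thread):

# hypothetical .htaccess placed in the tunneler's subfolder
RewriteEngine Off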

    in reply to: Connecting To 1And1 Mysql Databases #33120

Hi – I'm not getting an error, it's just not connecting to the database. Instead I see (see the pic attached)

the HTTP request and the presentation of the web server – in other words, its home page.

The SQL tunnel file is correctly chmod 644, and there are no errors in the version there either.

No error in testing the connection – no 500 or anything like that.

Could this mean I do not have the correct server host name?

Thanks

    'sathish' wrote:

    Hello,

Can you please tell us the exact error message you are getting when trying to connect? Could you please attach a screenshot of the error message?

Also, you can refer to this FAQ link for HTTP connection issues: http://webyog.com/faq/category/21/https_tunneling.html

    Regards,

    Sathish

    in reply to: Scheduled Job Success And Not Success #33116

I attach a pic of the error while scheduling, and also a pic of a possible solution…

and then a pic of the email wherein it says success… and not success…

What do you think of the solution here?

    'ashwin' wrote:

    Hi,

We are checking this issue at our end, but we need some more information:

What error are you getting on saving the job? Are you getting a 'Login failure: unknown user or password' error? Kindly attach a screenshot of the error message.

Could you empty the log file, try to run the job file again, and send us the log file?

Are you trying to import using 'Execute SQL script'? What error did you get when you tried to import? Please empty the SQLyog.err file, try restoring using 'Execute SQL script' again, and send us the file.

Also, tell us the MySQL server version (SELECT VERSION()) where you are trying to import the SQL file (we can see from your dump that you used a MySQL 5.1.53 server to back up the database).

    Regards,

    Ashwin A

    in reply to: Error While Importing External Data #32637
    'peterlaursen' wrote:

    Is it a 64 bit or 32 bit win7?

It's a 64-bit OS, but the reinstall seemed to fix it…

    e

    in reply to: Looking For A Backup And Keep Old Backups #32408

Yes, that was it – precisely.

Thanks again!

    in reply to: Looking For A Backup And Keep Old Backups #32407
    'peterlaursen' wrote:

Well .. yes – if I understand!

Use the 'scheduled backup' feature (from the 'Powertools' menu). In the wizard there is an option to add a timestamp to the file name, and the ZIP option will probably be useful too. You may do so on your local machine or on the server using SJA for Linux.

    FAQ links: http://webyog.com/faq/category/25/sqlyog-job-agent-sja.html
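As an illustration of the SJA-for-Linux route mentioned above – a minimal sketch, where the job file name and paths are purely hypothetical; the wizard's timestamp option is what keeps older backups from being overwritten:

# hypothetical crontab entry: run the saved SQLyog backup job every night at 02:00
0 2 * * * /home/user/sja /home/user/backup_job.xml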

    in reply to: Error In Sync Job #31738
    'peterlaursen' wrote:

    I can also see that you are using HTTP tunnel. That could very well be part of the problem. I think there were some fixes lately. So please try 8.71.

OK, I purchased the upgrade, installed it, and copied the SQLyogTunnel.php to the destination. Same error – here it is.

    Sync started at Sun Dec 26 11:05:01 2010

    `wp_posts` 30062 0 Error No. 1

    Error in tunneling. Please send the HTTP response to http://www.webyog.com/support

This was during a direct sync from the latest version of the SQLyog Ultimate program.

I will try the command-line interface next.

OK, that failed.

I could not attach the data; it was too big to upload.

    in reply to: Error In Sync Job #31690
    'nithin' wrote:

    FYI: If you do not want to delete the extra rows in the target table you have to select the option 'Don't delete extra rows in target database'.

Your sync output looks like you have not selected this option, and as a result the extra rows in the target are deleted.

See the screenshot attached.

Also, the “-r” option is required only if the target table is empty. I suggested this option because your first post says the target table is empty.

    `node_revisions` 33024 0 Error No. 1

The version 8.62 change log says:

    — SJA (Data sync) now supports an additional -r parameter that tells how big CHUNKS should be when copying to an empty table.

By 'extra rows' I take you to mean duplicate rows, even if the target table does not have a primary index –

that there is some parsing of the row to determine whether it is a duplicate row…?

Also, is there an interface feature for controlling the rows, similar to the -r option on the command line?

Thanks again for your help.

    in reply to: Error In Sync Job #31687

Perhaps I was fooled again; it appears the job successfully inserted rows.

    in reply to: Error In Sync Job #31686
    'nithin' wrote:

We are looking into this issue. We have almost replicated it at our end.

The error is because the tunneler is not able to handle the large query (bulk INSERT statement) sent; one guess is a lack of memory allocated to PHP on the server side.

As a workaround you can make the bulk INSERT query size smaller by doing the following:

– Save the data sync job file

– Run the job from the command prompt: >sja dsync_job.xml -r20

(this -r option retrieves 20 rows at a time from the source server and frames the BULK INSERT query sent to the tunneler; by default it is 1000 rows. You can try different values also)

– It was working at our end. Please try this and let us know the status.

We will check this issue in detail tomorrow and update you.

I saved the job file as sync-node-revisions-table.xml and ran it from the command prompt. See the attached file for an image of the output:

'The data sync script has been generated at etc…'

I ran it twice with the same results. Perhaps I am doing something wrong here, I don't know.

    in reply to: Error In Sync Job #31684
    'nithin' wrote:

    Hello,

Please tell us the SQLyog version.

With HTTP, the error details are not explained properly. We will take it up in an upcoming version.

This problem can happen due to network problems also. Does this always happen when you try to sync this particular table? Please confirm.

I can see that your target table is empty, and in that situation we frame a bulk INSERT query and execute it against the target. Can you check “max_allowed_packet” for both the target and the source? If the “max_allowed_packet” size is smaller on the target server, then the INSERT for even a single row can fail against the target.

So please give us the following:

– Execute the query SHOW VARIABLES LIKE 'max_allowed_packet'; on both source and target and paste the output here.

– Do you have any BLOB/TEXT columns in the table? Please execute the following query in the source to find the longest value stored:

SELECT max(length(long_column_name)) FROM the_table;

– Can you provide us with just the table structure?

You can create a support ticket and we will continue from there:

    http://webyog.com/support/ttx.cgi

    Hi – thanks for your quick response.

    It does happen on this table, but this is the only one in this project where it does happen.

I have used it fine on other DBs and other tables…

I am trying to migrate a table that originated on another remote server

and has now been replicated on my localhost, so I am pushing the rows to the target now…

    I am using v8.14

    SHOW VARIABLES LIKE 'max_allowed_packet' on my local host is 1048576

    SHOW VARIABLES LIKE 'max_allowed_packet' on my remote host is 16776192

    — so it is larger in the target database.
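For reference, if the smaller of the two values were ever the limiting factor, it can be raised at runtime – a minimal sketch, assuming the SUPER privilege on that server; the new global value only applies to connections opened afterwards:

SET GLOBAL max_allowed_packet = 16777216; -- 16 MB, matching the remote server
SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet'; -- verify the new global value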

There is longtext – here is the structure – it is a Drupal table.

CREATE TABLE node_revisions (
  nid int(10) unsigned NOT NULL default '0',
  vid int(10) unsigned NOT NULL auto_increment,
  uid int(11) NOT NULL default '0',
  title varchar(255) NOT NULL default '',
  body longtext NOT NULL,
  teaser longtext NOT NULL,
  log longtext NOT NULL,
  `timestamp` int(11) NOT NULL default '0',
  format int(11) NOT NULL default '0',
  PRIMARY KEY (vid),
  KEY nid (nid),
  KEY uid (uid)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

SELECT max(length(body)) FROM node_revisions was 15806 in the remote table;

in the local DB it was 16130 in the body column.
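Since the table has three longtext columns, the same check can be extended to all of them – a small sketch based on the structure above:

SELECT MAX(LENGTH(body)), MAX(LENGTH(teaser)), MAX(LENGTH(log)) FROM node_revisions;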

    in reply to: Monitoring Without Monyog? #30081

Thanks for your interest in my problem.

This part of the query took 7 hours:

UPDATE t_cities a, t_counties b
SET a.CountyId = b.CountyId
WHERE a.CityAliasName = b.CityAliasName
  AND a.State = b.State;

I have (just now) altered both the t_cities and t_counties tables to include CityAliasName as a fulltext key, so I will see what happens the next time I do an update to the t_cities table.
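As a hedged aside: an equality join like the one in the UPDATE above is normally served by a plain composite index on the join columns rather than a FULLTEXT key – a minimal sketch, with index names chosen here purely for illustration:

ALTER TABLE t_cities ADD INDEX idx_cities_alias_state (CityAliasName, State);
ALTER TABLE t_counties ADD INDEX idx_counties_alias_state (CityAliasName, State);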

In Tools -> Preferences -> Powertools I have enabled the query profiler and set all the checks below it to ON.

My question about that is – how much cost is there to having the profiler ON? Is there a direct correlation between having it ON and the length of the query execution time?

And during the query run – perhaps next time I will try to schedule it so that it runs at night…

My other question is – do you think it would run much faster on a shared-hosting Linux MySQL server on godaddy.com than it runs on my dual-core Windows XP Pro desktop?

    in reply to: Monitoring Without Monyog? #30079
'peterlaursen' wrote:

I simply do not understand! 'Monitoring without MONyog' posted in the 'SQLyog' category, with reference to SJA. Please explain what program you are using and what you are doing.

Sorry to be so dense.

I am executing a query and simply want to know if there is a way to find out where it is – which statement it is on.

SQLyog Enterprise is what it is.

statement a;

statement b;

etc… is it on a, b, or whichever?

Is there a way to know?

Later…

I opened another instance of the SQLyog Enterprise program and was able to examine the current process running. Also, under Status there is a variable for “last_query_cost” – which is 10.+++
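For the record, the second-instance trick corresponds to plain MySQL statements as well – a minimal sketch (standard server commands, not anything SQLyog-specific):

SHOW FULL PROCESSLIST; -- lists each connection and the statement it is currently executing
SHOW STATUS LIKE 'last_query_cost'; -- the optimizer's cost estimate for the last query in this session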

    in reply to: Update Set Value From Other Table #30061
'peterlaursen' wrote:

If you can access the information, you can also check whether there is a significant increase in CPU and IO activity on the server while the query is executing. If there is nothing, it looks like a 'deadlock'.

I left for the day while the query was running, and came back surprised to find it did finish. I will look at your examples to see what I can see! Thanks for your prompt reply on a day when most people would be comatose from too much to eat… Upon cursory examination it appeared to be a good result.

    all the best,

    eddie rosenthal
