Forum Replies Created
manoharMember
Hi,
This is an issue with Chrome/Safari. I have reported it here: http://code.google.com/p/chromium/issues/detail?id=8600.
Thanks for reporting.
manoharMember
Hi AlexH,
The checked/unchecked state of a connection is stored only if you select one of the actions in the dropdown and press “go” on the “List of servers” page. Checking/unchecking a connection is not necessary for editing it; however, if you do so, the state is not saved, as you have described. We will discuss this.
Thanks for reporting.
manoharMember
Hi,
We registered 2 connections, opened the dashboard page, and left it over the weekend. When we checked on Monday morning it was working fine.
1) How many connections have you registered, and how many connections are you viewing in the dashboard?
2) Which browser are you using?
3) Is the Monitors/Advisors page also open while you are viewing the dashboard?
manoharMember
Hi Boyd,
Currently we are working on improving the performance of MONyog. We are doing MAJOR optimization of the SQLite queries for the case where the data collected by MONyog runs into GBs; this solves the problem of high disk I/O usage. 3.1 Beta 1 is expected in a week. Post 3.1, we will take up and discuss all the points you mentioned. Thanks again for your involvement and encouraging words.
manoharMember
William Vicary wrote on Feb 19 2009, 03:27 PM:
Hi there,
First I'd like to say: fantastic software, and I will definitely be buying it, probably regardless of this thread! Fantastic!
I have two niggles which are just highly irritating; let me explain my implementation and then you may see my problem.
We run approximately 200 websites from one DB server and, to be honest, we haven't exactly been hot on optimisation of queries (which is now biting us on the foot!). We have a lot of queries going through the server across about 50 databases.
Now my problem: when viewing the query analyzer (sniffer, as I do not have the others set up currently) there are a few problems:
A) I cannot see which database a query originated from, which as you can imagine is pretty hard to pinpoint when you have ~50 databases on the go!
B) I cannot easily explain a query within this list; it would mean copying it to SQLyog and running it from there!
I propose a similar implementation for at the very least the sniffer, which has “explain” available as well as which database the query originated from. If this is available and I missed it, please let me know and I'll pretty much instantly make a purchase of this fine software!
Cheers
Will
Hi Will,
A) We are presently not showing the context database, since MySQL does not provide it every time. For the 'Show processlist' sniffer, 'db' is only available if the client has selected a default database; otherwise it is NULL. You can check here: http://dev.mysql.com/doc/refman/5.1/en/show-processlist.html. So it is not available in all cases. The Proxy LUA script does not provide the context database either. In the slow query log there is a 'Database' field, but it is not present every time. (See the sketch at the end of this reply.)
B) We will be implementing “Explain” on queries in the Query Analyzer in the next release (3.1).
Thanks for your insight.
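For illustration only, here is a minimal Python sketch of the point about 'db' (assuming the mysql-connector-python package and placeholder credentials; this is not how MONyog collects its data). The 'db' value comes back as NULL (None in Python) for any client that has not selected a default database:
# List running statements and their context database, if any.
# Connection details are placeholders.
import mysql.connector

conn = mysql.connector.connect(host="127.0.0.1", user="root", password="secret")
cur = conn.cursor()
cur.execute("SHOW PROCESSLIST")
for row in cur.fetchall():
    # Classic columns: Id, User, Host, db, Command, Time, State, Info
    db = row[3]
    info = row[7]
    print(row[0], row[1], db if db is not None else "<no default database>", info)
cur.close()
conn.close()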
manoharMember
Hi Boyd,
We are currently discussing your suggestions. We will get back to you once we arrive at a decision, within a couple of days.
manoharMember
Hi,
We intentionally did not include these fields (host, user, etc.), since we thought that while identifying problematic queries, only the query has to be taken into account, irrespective of the host it originated from or the user who issued it. So if the Log Analyzer shows a query with a count of 3, it may have aggregated that information from different hosts/users, and in that scenario showing host/user information doesn't make sense. But we will give it a second thought after 3.0 goes GA. We are recording your request. Thanks for reporting.
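To illustrate the aggregation described above, here is a rough Python sketch (with hand-written, hypothetical log entries, not MONyog's actual implementation) of grouping by the query text alone, so that one count may span several hosts/users:
# Aggregate identical query texts, ignoring host and user on purpose.
from collections import defaultdict

entries = [
    ("SELECT * FROM orders WHERE id = ?", "app1.example.com", "web"),
    ("SELECT * FROM orders WHERE id = ?", "app2.example.com", "web"),
    ("SELECT * FROM orders WHERE id = ?", "admin.example.com", "root"),
]

counts = defaultdict(int)
for query, host, user in entries:
    counts[query] += 1          # host and user are deliberately not part of the key

for query, count in counts.items():
    print(count, query)         # prints: 3 SELECT * FROM orders WHERE id = ?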
manoharMember
Hi,
We are thinking about what we can do in this situation. We need some time to arrive at a decision, and we will notify you of any updates. You can either inform us via the forums or use our ticket system here: http://webyog.com/support/ttx.cgi (if you need privacy) for filing bugs.
manoharMember
manohar wrote on Dec 5 2008, 08:53 AM:
Hi,
1) Grouping of servers is already in our to-do list, but the priority is not yet decided.
2) Connection details will be exposed in the MONyog Object Model; we are planning a release by Wednesday next week (10th December 2008).
We will be exposing the server name, server-id, host-name, user-name and many more. And we are thinking of a workaround for grouping; we will come up with all the details once we have released next week.
Thank you for your interest.
Unfortunately we won't be able to release tomorrow; we are planning for a release next week.
manoharMember
Hi,
Thank you for your interest and encouraging words.
1) “Filtering options for the result displayed by the Log Analyzer” is already in our to-do list; the priority is not yet decided.
2) Your suggestions regarding the GUI have been added to our to-do list. We will discuss them.
manoharMember
osolo wrote on Dec 1 2008, 09:29 PM:
That would be great! I'm finding more and more that global settings just don't work for me because each database instance is unique. I think your scripting engine is impressive, but I'm hoping you'll find a way to make something as fundamental as this integrated into a UI so that I don't have to constantly mess around with scripts.
Hi,
1) Grouping of servers is already in our to-do list, but the priority is not yet decided.
2) Connection details will be exposed in the MONyog Object Model; we are planning a release by Wednesday next week (10th December 2008).
We will be exposing the server name, server-id, host-name, user-name and many more. And we are thinking of a workaround for grouping; we will come up with all the details once we have released next week.
Thank you for your interest.
manoharMember
zulf wrote on Jul 29 2008, 02:39 PM:
I would ideally like to see summarized output (something that is different from opening the log using the vi editor), like the statement text, a count of how many times the statement was run, and the last timestamp when it was run.
We have a similar request from other users; they require the first and last occurrence times of the queries. We already have this in our to-do list.
Thanks for your views. And the issue with analyzing a large log file with the “All and a time stamp” filter setting is confirmed; we are working on the fix.
manoharMember
zulf wrote on Jul 28 2008, 04:19 PM:
where do i set the log chunk size? is it the reading limit – read last/all?
Yes, it is the reading limit: you can either specify Last “x” bytes/KB/MB, or All if you choose to scan the entire file.
Are you using the default value (Last 1 MB)?
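For reference, the Last “x” limit simply means reading only the tail of the log file; here is a minimal Python sketch of that idea (using a placeholder local log path, purely as an illustration, not MONyog's actual code):
# Read only the last CHUNK bytes of a log file, the same idea as the
# "Last 1 MB" reading limit. Path and size below are placeholders.
import os

LOG_PATH = "/var/log/mysql/mysql-slow.log"  # placeholder path
CHUNK = 1 * 1024 * 1024                     # "Last 1 MB"

with open(LOG_PATH, "rb") as f:
    size = os.fstat(f.fileno()).st_size
    f.seek(max(size - CHUNK, 0))            # jump to the start of the tail
    data = f.read()

print(len(data), "bytes read from the tail of the log")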
manoharMember
Hi,
Yes, it is possible to analyze a query log on a Linux machine from a Windows machine. All you have to do is:
1) Go to the “SSH server details” section in the connection settings.
2) “Do you want to use SSH?” –> choose Yes.
3) Choose “Linux” as the OS of the host.
4) Enter the SSH details. (The SSH user should have permission for SFTP access.)
5) Go to “Log Analyzer Settings” and choose “Via SFTP”.
6) Enter the log file path.
Now you are ready to analyze: just go to the Log Analyzer page, select the connection for which you entered the log file path, and analyze.
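Under the hood this amounts to fetching the log file over SFTP; here is a rough Python sketch of that idea (assuming the paramiko package and placeholder SSH details/log path — MONyog handles this internally once the settings above are saved, so this only shows why the SSH user needs SFTP access to the file):
# Copy the slow query log from the Linux host to the local machine over SFTP.
# Host, credentials and paths are placeholders.
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("linux-db-host.example.com", port=22,
            username="monyog", password="secret")

sftp = ssh.open_sftp()
sftp.get("/var/log/mysql/mysql-slow.log", "mysql-slow.log")  # remote -> local copy
sftp.close()
ssh.close()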
manoharMember
Let's start here.
Unzip the attached file; it will produce a folder with an .exe inside.
Just double-click the .exe.
It will build two files in that folder, “check.txt” and “log.txt”, and it will attempt to copy “check.txt” to the … {AppData}SQlyog folder.
Please
1) attach the “log.txt” generated here.