I have established a database connection to a MySQL 5.0 database in the NetBeans Services tab and tried to view the
contents of a table via right-click->View Data. It turned out to be a large table (17+ million rows) and NB did not
try to paginate it. After 7-8 minutes I tried to kill the query, but NetBeans became unresponsive and had to be killed
via the Task Manager. I tried again on some other tables, and while it behaved as expected on smaller tables (a few
thousand records), it hung again on another large table (6.2 million rows).
The bug is annoying, but it's not a stopper. Anyway, I agree that the issue should be fixed in a patch.
I just tried this on an Oracle database with 30+ million rows, and it worked fine. It did take a bit (10 seconds)
before the prompt came back saying "got 200 records, do you want more?". But there is also a progress bar at the
bottom saying "Executing SQL statement", and if you click the little 'x' it will cancel the query (asking you first
if you are sure).
Gary: are you getting the progress bar?
I just tried something else - I changed my query to do a LIKE clause on a non-indexed column. This of course takes
forever, and when I tried to cancel the query, it asked me if I was sure, I said 'yes', and it took a while to take
effect, but it did cancel the query.
So this all looks good to me...
Another test case - a 30-million-row table in MySQL on my local machine. This table has no index, and it seems to
be giving MySQL a hard time.
NetBeans is not dead but is pretty unresponsive, and an attempt to cancel the SQL execution thread from the progress bar
so far has had no effect (after about 30 seconds).
My CPU is pegged at 100%, and the culprit is NetBeans.
Then, after about 1 minute, NetBeans comes back with an exception:
java.io.EOFException: Can not read response from server. Expected to read 4 bytes, read 0 bytes before connection was
Caused: com.mysql.jdbc.CommunicationsException: Communications link failure
Last packet sent to the server was 22 ms ago.
Further investigation showed that MySQL crashed. Ouch.
So I can see these big queries causing real problems for the server. What can we do to be good citizens with the server?
Let me think about this some more...
Also, I want to figure out why NetBeans is pegging the CPU...
Gary, can you please tell me what database you are using? From your email on nbusers it sounds like MySQL - is that correct?
It turns out MySQL will try to read in *all* the rows into the client before it gives you a single row, even if you set
the fetch size to, say, 1 row. See http://lists.mysql.com/java/9136.
The suggested solution is to add these properties to your URL:
for example: "jdbc:mysql://localhost:1527/mysql?useServerPrepStmts=true&useCursorFetch=true"
However, I tried this and it didn't work, just as it didn't for the user in the MySQL thread I linked to above.
What *did* work was to set the fetch size to Integer.MIN_VALUE, as Mark Matthews suggested. Oh, that's portable :)
But now this introduces another problem. MySQL can't handle having two threads doing work on the same connection. So
if one connection has results open, any operations on another thread block. This is *why* they read in all rows when
you issue a select statement.
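To make the workaround concrete, here is a minimal sketch (helper name and demo are mine, not the NetBeans code). There is no live database here, so a dynamic proxy stands in for a driver Statement and just records what the helper passes down:

```java
import java.lang.reflect.Proxy;
import java.sql.SQLException;
import java.sql.Statement;

public class MySqlStreaming {
    // Hypothetical helper: switch a MySQL Connector/J statement into
    // row-by-row streaming mode. Integer.MIN_VALUE is a driver-specific
    // magic value, not portable to other JDBC drivers.
    static void enableStreaming(Statement st) throws SQLException {
        st.setFetchSize(Integer.MIN_VALUE);
    }

    public static void main(String[] args) throws Exception {
        final int[] seen = new int[1];
        // Stub Statement via dynamic proxy: records the fetch size.
        Statement st = (Statement) Proxy.newProxyInstance(
                Statement.class.getClassLoader(),
                new Class<?>[] {Statement.class},
                (proxy, method, margs) -> {
                    if (method.getName().equals("setFetchSize")) {
                        seen[0] = (Integer) margs[0];
                    }
                    return null;
                });
        enableStreaming(st);
        System.out.println("fetch size set to " + seen[0]);
    }
}
```

Against a real connection the same call enables streaming, but with the one-open-result-set-per-connection restriction described above.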
This directly impacts NB, because if I read in say 200 of the 30 million rows, and then I try to close the connection,
it doesn't take effect - the thread closing the connection hangs. After about three minutes it finally returns and
closes the connection.
When I try to open a separate command window, it also fails with this error:
"Error code 0, SQL state null: Streaming result set com.mysql.jdbc.RowDataDynamic@361de2 is still active. No statements
may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called
.close() on any active streaming result sets before attempting more queries."
So you can see it's a real bind. If I enable streaming using setFetchSize(Integer.MIN_VALUE), we run into this problem.
If I disable streaming, then you get these nasty hangs and OutOfMemory exceptions when trying to read large tables.
And of course JDBC doesn't let me ask how big the result will be before I execute it.
Let me think about the best way to solve this...
Another approach that is working is to use statement.setMaxRows(). I just set it to 200,000, way more than anyone would
look at in our tool, and it fixed the hanging/out-of-memory problem without introducing the locking issues.
This seems like a reasonable fix. I'll look at adding a property that lets you increase the max rows to a larger size
if for some crazy reason somebody needs that.
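To illustrate why the cap helps (a toy simulation, not driver code): with setMaxRows the client materializes at most maxRows rows no matter how big the table is, so memory use is bounded. The numbers below match the ones mentioned above:

```java
import java.util.stream.LongStream;

public class MaxRowsDemo {
    public static void main(String[] args) {
        final long TABLE_ROWS = 30_000_000L; // the problem table above
        final long MAX_ROWS = 200_000;       // the proposed default cap

        // Stand-in for a result set: Statement.setMaxRows means the
        // driver hands the client at most MAX_ROWS rows, regardless of
        // how many the table contains.
        long materialized = LongStream.rangeClosed(1, TABLE_ROWS)
                .limit(MAX_ROWS)
                .count();
        System.out.println("rows materialized: " + materialized);
    }
}
```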
At my previous job I was working with Oracle and NetBeans 5.5, and everything was OK.
Now I am working with MySQL and having trouble with big tables, although all my tables are indexed.
I just ran another query with about 0.5 million records in it, and it was very slow.
I am getting the progress bar, but clicking on the 'x' near it has no effect. I waited over 5 minutes and quit NB.
It did warn me at that point that the query would be terminated, so I assume it hadn't been terminated before.
I just found out that MySQL treats Statement.cancel() as a no-op, so the fact that you can't cancel a long-running query
is a defect of the driver and there really is nothing we can do about it. If the underlying driver doesn't catch a
cancel and holds onto the thread, there is no way in Java that we can kill the thread. You'll just have to exit
NetBeans to kill it.
We can work around this by setting max rows for queries that return large result sets. But if it's a long-running query
for other reasons (like it's doing a massive join or a huge table scan) there is really nothing much we can do until
MySQL fixes this issue.
I logged an issue for this, see http://bugs.mysql.com/bug.php?id=36332
OK, I checked in the fix to set maxrows to 200,000. You can modify this default value by creating the file
config/Preferences/org/netbeans/modules/db/core.properties and adding the line
maxrows=<number of rows>
If you set it to 0, it means read in all rows available.
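A sketch of how the override might be interpreted (parser name is hypothetical, not the checked-in code): missing or malformed values fall back to the default, and 0 is kept as-is because per the JDBC spec Statement.setMaxRows(0) means "no limit":

```java
public class MaxRowsPref {
    static final int DEFAULT_MAX_ROWS = 200_000;

    // Hypothetical parser for the maxrows line in core.properties.
    // 0 is passed through unchanged: setMaxRows(0) means "no limit",
    // i.e. read in all rows available.
    static int parseMaxRows(String raw) {
        if (raw == null) return DEFAULT_MAX_ROWS;
        try {
            int v = Integer.parseInt(raw.trim());
            return v >= 0 ? v : DEFAULT_MAX_ROWS;
        } catch (NumberFormatException e) {
            return DEFAULT_MAX_ROWS;
        }
    }

    public static void main(String[] args) {
        System.out.println(parseMaxRows(null));     // no file: default
        System.out.println(parseMaxRows("500000")); // user override
        System.out.println(parseMaxRows("0"));      // read all rows
    }
}
```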
Note that there is no pagination, so if you want the 200,001st row, it won't be available. Not that I can imagine
anybody wanting to look at that many rows in NB.
I'm copying James as this may need a release note.
Here is the change set for this fix:
Note this doesn't fix the ability to cancel a query, for the reason I mentioned in the previous note. That just can't
get fixed without a change from the MySQL community.
To clarify, the path I mention for the properties file is under your userdir...
Please move the preferences code away from SQLExecuteHelper, it doesn't belong there. Introduce a SQLPreferences class
or similar and have the preferences code there. You don't need to pass maxRows into SQLExecuteHelper, just call
SQLPreferences from there.
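The class the reviewer asks for might look roughly like this sketch (names and structure are my assumption; the real module would read the NetBeans preferences node rather than java.util.prefs userRoot, which is used here only so the demo runs standalone):

```java
import java.util.prefs.Preferences;

public final class SQLPreferences {
    private static final String KEY = "maxrows";
    private static final int DEFAULT_MAX_ROWS = 200_000;

    private final Preferences prefs;

    SQLPreferences(Preferences prefs) {
        this.prefs = prefs;
    }

    // Callers like SQLExecuteHelper ask this class for the limit
    // instead of having maxRows passed in.
    int maxRows() {
        return prefs.getInt(KEY, DEFAULT_MAX_ROWS);
    }

    void setMaxRows(int rows) {
        prefs.putInt(KEY, rows);
    }

    public static void main(String[] args) throws Exception {
        // Demo against a throwaway preferences node.
        Preferences node = Preferences.userRoot().node("demo/sqlprefs");
        SQLPreferences p = new SQLPreferences(node);
        System.out.println("default: " + p.maxRows());
        p.setMaxRows(500_000);
        System.out.println("after set: " + p.maxRows());
        node.removeNode(); // clean up the demo node
    }
}
```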
It turns out that statement.cancel() does work, it just takes a verrry longgg time. So we might want to re-architect
the way we execute statements so that when the user tries to kill the query through the progress bar, it calls
Statement.cancel().
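The re-architecture hinted at above is essentially the cancel-from-another-thread pattern (JDBC allows Statement.cancel() to be called from a thread other than the one executing). A minimal sketch, with a proxy stub standing in for the driver so it runs without a database:

```java
import java.lang.reflect.Proxy;
import java.sql.Statement;
import java.util.concurrent.CountDownLatch;

public class CancelSketch {
    public static void main(String[] args) throws Exception {
        CountDownLatch cancelled = new CountDownLatch(1);
        // Stub Statement: cancel() trips the latch, standing in for
        // the driver eventually aborting the running query.
        Statement st = (Statement) Proxy.newProxyInstance(
                Statement.class.getClassLoader(),
                new Class<?>[] {Statement.class},
                (p, m, a) -> {
                    if (m.getName().equals("cancel")) cancelled.countDown();
                    return null;
                });

        // The "executing" thread, blocked until the query is aborted.
        Thread query = new Thread(() -> {
            try {
                cancelled.await();
                System.out.println("query aborted");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        query.start();

        st.cancel(); // what the progress bar's 'x' handler would call
        query.join();
    }
}
```

With the real MySQL driver the cancel still takes a long time to take effect, as noted above, but routing the 'x' through Statement.cancel() at least gives the driver the chance.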
Verified with trunk build 080428.
The fix has been ported into the release61_fixes branch: