Posted to user@hive.apache.org by Gabor Makrai <ma...@gmail.com> on 2013/02/04 11:44:48 UTC

Problem with Hive JDBC server

Hi guys,

I'm writing because I ran into a very strange problem which probably
affects all Hive distributions.
I wrote a small, single-main-function Java program that connects to my
Hive server over JDBC, lists the database tables (SHOW TABLES), and then
closes the ResultSet, the Statement, and the Connection, repeating this
1000 times. The problem is that the running Hive JDBC server does not
release file descriptors, and over time it fails with a
"Too many open files" IOException from the JVM.

I tested with Hive 0.9, 0.8.1, and the patched Hive 0.9 installed in
CDH4.1.1.

If it is a known issue, could you tell me the solution? If it is not, I
can create a new ticket in JIRA and, with a little help, probably fix the
problem and contribute the solution.

Thanks,
Gabor

Re: Re:RE: Problem with Hive JDBC server

Posted by Prasad Mujumdar <pr...@cloudera.com>.
Hmm, there's a possibility that the execute and close for a given
statement are not handled by the same Thrift thread. Can you please verify
the test case with a single worker thread on the server side?
You can run the server using the following:

hive --service hiveserver --maxWorkerThreads 1 --minWorkerThreads 1

thanks
Prasad


RE: Re:RE: Problem with Hive JDBC server

Posted by Bennie Schut <bs...@ebuddy.com>.
What JDBC driver are you using? Also compiled from trunk? I ask because I
remember a JIRA a while back where the JDBC driver didn't let the server
know the connection should be closed (). If that's the case, updating the
JDBC driver could work. However, that might be a bit of a long shot.
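
Whichever side is leaking, a defensive client can use Java 7
try-with-resources so the close calls are always issued; a sketch under
the same illustrative connection settings as above (this guards the
client, it is not a fix for a server-side leak):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ShowTablesOnce {
  public static void main(String[] args) throws Exception {
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    // try-with-resources closes rs, stmt and con in reverse order,
    // even if an exception is thrown while iterating.
    try (Connection con = DriverManager.getConnection(
             "jdbc:hive://localhost:10000/default", "", "");
         Statement stmt = con.createStatement();
         ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
      while (rs.next()) {
        System.out.println(rs.getString(1));
      }
    }
  }
}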


Re: Re:RE: Problem with Hive JDBC server

Posted by Gabor Makrai <ma...@gmail.com>.
Hi guys,

Bad news for me: I checked out and compiled the Hive trunk and got the
same problem.
I attached the output of the lsof command from before and after running my
test program for 100 "SHOW TABLES" iterations. Is there any explanation
for why my JDBC server process doesn't release those files?

Thanks,
Gabor
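
For reproducibility, a small sketch of the descriptor count behind that
lsof comparison; it is Linux-only and assumes the JVM user is allowed to
read the server process's /proc entry:

import java.io.File;

public class FdCount {
  public static void main(String[] args) {
    // Pass the hive server's PID. /proc/<pid>/fd holds one entry per
    // open file descriptor; lsof also lists mmap'ed files and more.
    String pid = args[0];
    File fdDir = new File("/proc/" + pid + "/fd");
    String[] fds = fdDir.list();
    if (fds == null) {
      System.err.println("cannot read " + fdDir + " (permissions?)");
      return;
    }
    System.out.println("open descriptors for pid " + pid + ": " + fds.length);
  }
}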



Re:Re:RE: Problem with Hive JDBC server

Posted by 王锋 <wf...@163.com>.

I got it. Please see https://issues.apache.org/jira/browse/THRIFT-1205.

I upgraded Thrift to libthrift-0.9.0.

thanks

Re:RE: Problem with Hive JDBC server

Posted by 王锋 <wf...@163.com>.
When I was using hiveserver, the following exception was thrown:

Hive history file=/tmp/hdfs/hive_job_log_hdfs_201302010032_1918750748.txt
Exception in thread "pool-1-thread-95" java.lang.OutOfMemoryError: Java heap space
    at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:353)
    at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
    at org.apache.hadoop.hive.service.ThriftHive$Processor.process(ThriftHive.java:730)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:253)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
    at java.lang.Thread.run(Thread.java:722)

I am using Hive 0.7.1-cdh3u1 with thrift-0.5.0.jar and thrift-fb303-0.5.0.jar.
How can this be fixed? And would Hive 0.7.1 work with Thrift 0.9.0? Thanks.
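
The readStringBody frame in that trace is the point where TBinaryProtocol
allocates a buffer whose size is read straight off the wire, so a corrupt
or non-Thrift byte stream can demand a huge allocation; the following is
an illustrative sketch of that failure mode, not Thrift's actual source:

import java.io.DataInputStream;
import java.io.InputStream;

public class NaiveStringRead {
  // Illustrative only: binary-protocol string decoding reads a 4-byte
  // length and then allocates that many bytes. If a non-Thrift client
  // sends e.g. the ASCII bytes "HTTP" where a length is expected, the
  // length decodes to 0x48545450 (about 1.2 GB) and the allocation
  // alone can exhaust the heap.
  static String readString(InputStream in) throws Exception {
    DataInputStream din = new DataInputStream(in);
    int len = din.readInt();     // untrusted, corruption-controlled
    byte[] buf = new byte[len];  // potential OutOfMemoryError here
    din.readFully(buf);
    return new String(buf, "UTF-8");
  }
}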






RE: Problem with Hive JDBC server

Posted by Bennie Schut <bs...@ebuddy.com>.
Looking at the versions, you might be hitting
https://issues.apache.org/jira/browse/HIVE-3481, which is fixed in 0.10.

On my dev machine the test runs with success (Running time: 298.952409914).
That build includes the patch, so it's worth looking at.


Re: Problem with Hive JDBC server

Posted by Gabor Makrai <ma...@gmail.com>.
Yes, of course! I attached the code!


On Mon, Feb 4, 2013 at 11:57 AM, Gabor Makrai <ma...@gmail.com> wrote:

> Yes, of course! :) I attached the code!

RE: Problem with Hive JDBC server

Posted by Bennie Schut <bs...@ebuddy.com>.
Since it's small, can you post the code?
