Posted to user@phoenix.apache.org by Krishna <re...@gmail.com> on 2014/09/27 05:54:12 UTC

Commit exception with create index

Hi,

I'm running into the following error when running a CREATE INDEX statement.

CREATE INDEX idx_name ON table_name (COL1, COL2) INCLUDE (val)
DEFAULT_COLUMN_FAMILY='cf', DATA_BLOCK_ENCODING='FAST_DIFF', VERSIONS=1,
COMPRESSION='GZ';

Error: org.apache.phoenix.execute.CommitException:
java.util.concurrent.RejectedExecutionException: Task
org.apache.phoenix.job.JobManager$JobFutureTask@670fbc88 rejected from
org.apache.phoenix.job.JobManager$1@421d9604[Running, pool size = 128,
active threads = 128, queued tasks = 500, completed tasks = 99632]
(state=08000,code=101)

There are no more errors in sqlline. Are there any other logs that I can
check?

Thanks

Re: Commit exception with create index

Posted by James Taylor <ja...@apache.org>.
See http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/ThreadPoolExecutor.html,
and more specifically the discussion of bounded queues. queueSize is the
capacity of the bounded queue backing the thread pool on the client side.
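
To make that concrete, here's a minimal standalone Java sketch of the
ThreadPoolExecutor behavior (the sizes are arbitrary for the example; in
your stack trace the pool is 128 threads with 500 queued tasks):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedQueueDemo {
    public static void main(String[] args) {
        // Two threads backed by a bounded queue that holds at most three tasks.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 2, 60, TimeUnit.SECONDS, new ArrayBlockingQueue<Runnable>(3));
        try {
            // Two tasks start running and three get queued; the sixth can
            // neither run nor be queued, so execute() throws
            // RejectedExecutionException -- the same exception wrapped in the
            // CommitException above.
            for (int i = 0; i < 6; i++) {
                pool.execute(new Runnable() {
                    public void run() {
                        try {
                            Thread.sleep(5000);
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();
                        }
                    }
                });
            }
        } catch (RejectedExecutionException e) {
            System.out.println("Rejected: " + e);
        } finally {
            pool.shutdownNow();
        }
    }
}
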
Thanks,
James

On Fri, Sep 26, 2014 at 9:04 PM, Krishna <re...@gmail.com> wrote:
> Hi James,
>
> I'm using Phoenix 3.1 running on HBase 0.94.18.
> Could you share how queueSize should be estimated?
>
> Thanks
>
> On Fri, Sep 26, 2014 at 8:58 PM, James Taylor <ja...@apache.org>
> wrote:
>>
>> Hi Krishna,
>> Which version of Phoenix and HBase are you running? This exception
>> means that the thread pool on the client side is full (i.e. the queue
>> of the thread executor is full). You can try increasing the size of the
>> bounded queue through the phoenix.query.queueSize config param as
>> documented here: http://phoenix.apache.org/tuning.html
>>
>> Thanks,
>> James
>>
>> On Fri, Sep 26, 2014 at 8:54 PM, Krishna <re...@gmail.com> wrote:
>> > Hi,
>> >
>> > I'm running into the following error when running a CREATE INDEX statement.
>> >
>> > CREATE INDEX idx_name ON table_name (COL1, COL2) INCLUDE (val)
>> > DEFAULT_COLUMN_FAMILY='cf', DATA_BLOCK_ENCODING='FAST_DIFF', VERSIONS=1,
>> > COMPRESSION='GZ';
>> >
>> > Error: org.apache.phoenix.execute.CommitException:
>> > java.util.concurrent.RejectedExecutionException: Task
>> > org.apache.phoenix.job.JobManager$JobFutureTask@670fbc88 rejected from
>> > org.apache.phoenix.job.JobManager$1@421d9604[Running, pool size = 128,
>> > active threads = 128, queued tasks = 500, completed tasks = 99632]
>> > (state=08000,code=101)
>> >
>> > There are no more errors in sqlline. Are there any other logs that I can
>> > check?
>> >
>> > Thanks
>
>

Re: Commit exception with create index

Posted by Krishna <re...@gmail.com>.
Hi James,

I'm using Phoenix 3.1 running on HBase 0.94.18.
Could you share how queueSize should be estimated?

Thanks

On Fri, Sep 26, 2014 at 8:58 PM, James Taylor <ja...@apache.org>
wrote:

> Hi Krishna,
> Which version of Phoenix and HBase are you running? This exception
> means that the thread pool on the client side is full (i.e. the queue
> of the thread executor is full). You can try increasing the size of the
> bounded queue through the phoenix.query.queueSize config param as
> documented here: http://phoenix.apache.org/tuning.html
>
> Thanks,
> James
>
> On Fri, Sep 26, 2014 at 8:54 PM, Krishna <re...@gmail.com> wrote:
> > Hi,
> >
> > I'm running into the following error when running a CREATE INDEX statement.
> >
> > CREATE INDEX idx_name ON table_name (COL1, COL2) INCLUDE (val)
> > DEFAULT_COLUMN_FAMILY='cf', DATA_BLOCK_ENCODING='FAST_DIFF', VERSIONS=1,
> > COMPRESSION='GZ';
> >
> > Error: org.apache.phoenix.execute.CommitException:
> > java.util.concurrent.RejectedExecutionException: Task
> > org.apache.phoenix.job.JobManager$JobFutureTask@670fbc88 rejected from
> > org.apache.phoenix.job.JobManager$1@421d9604[Running, pool size = 128,
> > active threads = 128, queued tasks = 500, completed tasks = 99632]
> > (state=08000,code=101)
> >
> > There are no more errors in sqlline. Are there any other logs that I can
> > check?
> >
> > Thanks
>

Re: Commit exception with create index

Posted by James Taylor <ja...@apache.org>.
Hi Krishna,
Which version of Phoenix and HBase are you running? This exception
means that the thread pool on the client side is full (i.e. the queue
of the thread executor is full). You can try increasing the size of the
bounded queue through the phoenix.query.queueSize config param as
documented here: http://phoenix.apache.org/tuning.html
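
For example, something along these lines in the hbase-site.xml on the
client's classpath (a sketch only; the values are illustrative, not a
recommendation -- the 128 threads / 500 queued tasks in your stack trace
look like the defaults):

<!-- number of threads in the client-side executor -->
<property>
  <name>phoenix.query.threadPoolSize</name>
  <value>128</value>
</property>
<!-- capacity of the executor's bounded task queue -->
<property>
  <name>phoenix.query.queueSize</name>
  <value>5000</value>
</property>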

Thanks,
James

On Fri, Sep 26, 2014 at 8:54 PM, Krishna <re...@gmail.com> wrote:
> Hi,
>
> I'm running into the following error when running a CREATE INDEX statement.
>
> CREATE INDEX idx_name ON table_name (COL1, COL2) INCLUDE (val)
> DEFAULT_COLUMN_FAMILY='cf', DATA_BLOCK_ENCODING='FAST_DIFF', VERSIONS=1,
> COMPRESSION='GZ';
>
> Error: org.apache.phoenix.execute.CommitException:
> java.util.concurrent.RejectedExecutionException: Task
> org.apache.phoenix.job.JobManager$JobFutureTask@670fbc88 rejected from
> org.apache.phoenix.job.JobManager$1@421d9604[Running, pool size = 128,
> active threads = 128, queued tasks = 500, completed tasks = 99632]
> (state=08000,code=101)
>
> There are no more errors in sqlline. Are there any other logs that I can
> check?
>
> Thanks