Posted to user@phoenix.apache.org by Parveen Jain <pa...@live.com> on 2016/08/02 07:09:29 UTC

Re: Phoenix Create table timeout

Since my Phoenix client is timing out for large tables, I was trying to create the Phoenix table using Spark, but that is also giving an error.

Below is a Stack Overflow question about the same issue: http://stackoverflow.com/questions/38498096/create-table-in-phoenix-from-spark



Can anyone please suggest a way to create the table when the Phoenix Python client is giving a timeout? I am more interested in knowing how to create the table using Spark.
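
For reference, a minimal sketch of one possible workaround (not a verified fix): since the phoenix-spark DataFrame writer does not appear to create tables, the DDL can be issued over plain Phoenix JDBC from the Spark driver, with the client-side timeouts raised for the long initial scan. The ZooKeeper quorum "zk1,zk2,zk3:2181", the 1800000 ms timeout values, and the trimmed column list are placeholders, not values from this cluster.

// Sketch only: run the CREATE TABLE DDL over Phoenix JDBC from the Spark driver.
// Assumes the Phoenix client jar is on the driver's classpath.
import java.sql.DriverManager
import java.util.Properties

val props = new Properties()
// Creating a Phoenix table over an existing HBase table makes a pass over all
// existing rows (see PostDDLCompiler in the stack trace below), so raise the
// client-side timeouts before issuing the DDL.
props.setProperty("phoenix.query.timeoutMs", "1800000")
props.setProperty("hbase.rpc.timeout", "1800000")
props.setProperty("hbase.client.scanner.timeout.period", "1800000")

val conn = DriverManager.getConnection("jdbc:phoenix:zk1,zk2,zk3:2181", props)
try {
  conn.createStatement().execute(
    """CREATE TABLE IF NOT EXISTS "DATA_FOR_PHOENIX1" (
      |  pk VARCHAR PRIMARY KEY,
      |  "PARSED_DATA_COLUMN1".NAME0 VARCHAR
      |) SALT_BUCKETS = 16""".stripMargin)  // column list trimmed; the full DDL is quoted below
} finally {
  conn.close()
}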



Regards,

Parveen Jain

________________________________
From: Parveen Jain <pa...@live.com>
Sent: Friday, July 29, 2016 1:46:10 PM
To: user
Subject: Re: Phoenix Create table timeout


Hi Simon,

You are right that the table is available even after the timeout, but I am not fully sure whether it is a valid table for my queries. Is there any way out for this, or can I use that table for normal queries?
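
For what it is worth, one rough way to sanity-check whether the half-created table is usable (a sketch over plain Phoenix JDBC; the quorum string is a placeholder, and the table and column family names are taken from the DDL quoted further down the thread): confirm that its columns show up in SYSTEM.CATALOG and that a small bounded query against it returns rows.

// Sketch: check the table's metadata and data after a timed-out CREATE TABLE.
import java.sql.DriverManager

val conn = DriverManager.getConnection("jdbc:phoenix:zk1,zk2,zk3:2181")
try {
  val stmt = conn.createStatement()

  // 1. Metadata: the table and its columns should be present in SYSTEM.CATALOG.
  val meta = stmt.executeQuery(
    """SELECT COLUMN_FAMILY, COLUMN_NAME FROM SYSTEM.CATALOG
      |WHERE TABLE_NAME = 'DATA_FOR_PHOENIX1' AND COLUMN_NAME IS NOT NULL""".stripMargin)
  while (meta.next()) println(s"${meta.getString(1)}.${meta.getString(2)}")

  // 2. Data: a bounded query should return the existing HBase rows.
  val rs = stmt.executeQuery("""SELECT * FROM "DATA_FOR_PHOENIX1" LIMIT 10""")
  while (rs.next()) println(rs.getString("PK"))
} finally {
  conn.close()
}

If both checks pass, the table is at least queryable; whether the initial full-table pass that timed out actually completed is a separate question that this check does not answer.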


James,

Let me know whatever support is required to get this fixed. I can help with reproducing it (and can provide config files).


Thanks,

Parveen Jain

________________________________
From: James Taylor <ja...@apache.org>
Sent: Friday, July 29, 2016 4:39:03 AM
To: user
Subject: Re: Phoenix Create table timeout

Yes, an MR option would be great. I filed PHOENIX-3125 for this. Given our MR support, this would be a good initial contribution for someone.

Thanks,
James

On Thu, Jul 28, 2016 at 11:16 AM, Simon Wang <si...@airbnb.com> wrote:
This isn’t a solution, but I have encountered this problem before. It also seemed that the table becomes available in Phoenix even if the creation dies due to an error. I am also interested in a workaround! (Maybe an MR job?)

- Simon


On Jul 28, 2016, at 1:49 AM, Parveen Jain <pa...@live.com> wrote:


Hi All,
 Just starting out with Phoenix and facing issues regarding timeouts. The first timeout I am getting is while creating a table in Phoenix for an existing table in HBase.

My HBase table has almost 100M rows and is split across two nodes, each running a region server. One server is octa-core and the other is 16-core. I am using only the default settings of HBase, plus some timeout settings for Phoenix, before starting it. Can anyone suggest where I should start looking?

Regards,
Parveen Jain

Following is the error:
 CREATE TABLE "DATA_FOR_PHOENIX1"(pk VARCHAR PRIMARY KEY,"PARSED_DATA_COLUMN1".NAME0 VARCHAR,"PARSED_DATA_COLUMN1".NAME1 VARCHAR,"PARSED_DATA_COLUMN1".NAME2 VARCHAR,"VARCHAR,"PARSED_DATA_COLUMN1".NAME3 VARCHAR,"PARSED_DATA_COLUMN1".NAME4 VARCHAR,"PARSED_DATA_COLUMN1".NAME5 VARCHAR,"PARSED_DATA_COLUMN1".NAME6 VARCHAR) SALT_BUCKETS=16;

Error: Operation timed out (state=TIM01,code=6000)
java.sql.SQLTimeoutException: Operation timed out
        at org.apache.phoenix.exception.SQLExceptionCode$14.newException(SQLExceptionCode.java:332)
        at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
        at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:573)
        at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:562)
        at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:516)
        at org.apache.phoenix.iterate.ConcatResultIterator.getIterators(ConcatResultIterator.java:50)
        at org.apache.phoenix.iterate.ConcatResultIterator.currentIterator(ConcatResultIterator.java:97)
        at org.apache.phoenix.iterate.ConcatResultIterator.next(ConcatResultIterator.java:117)
        at org.apache.phoenix.iterate.BaseGroupedAggregatingResultIterator.next(BaseGroupedAggregatingResultIterator.java:64)
        at org.apache.phoenix.iterate.UngroupedAggregatingResultIterator.next(UngroupedAggregatingResultIterator.java:39)
        at org.apache.phoenix.compile.PostDDLCompiler$1.execute(PostDDLCompiler.java:228)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.updateData(ConnectionQueryServicesImpl.java:2013)
        at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:787)
        at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:186)
        at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:305)
        at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:297)
        at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
        at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:295)
        at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1255)
        at sqlline.Commands.execute(Commands.java:822)
        at sqlline.Commands.sql(Commands.java:732)
        at sqlline.SqlLine.dispatch(SqlLine.java:808)
        at sqlline.SqlLine.begin(SqlLine.java:681)
        at sqlline.SqlLine.start(SqlLine.java:398)
        at sqlline.SqlLine.main(SqlLine.java:292)
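
Since the timeout above is raised on the Phoenix client side (the trace goes through sqlline), one place to start looking is the client timeout configuration. Below is a rough sketch of the overrides in the hbase-site.xml on the client's classpath; the property names are the standard Phoenix/HBase ones, and the 1800000 ms (30 minute) values are only illustrative, not a recommendation for this cluster.

<configuration>
  <!-- Phoenix client-side statement timeout -->
  <property>
    <name>phoenix.query.timeoutMs</name>
    <value>1800000</value>
  </property>
  <!-- Raise the HBase RPC and scanner timeouts to match -->
  <property>
    <name>hbase.rpc.timeout</name>
    <value>1800000</value>
  </property>
  <property>
    <name>hbase.client.scanner.timeout.period</name>
    <value>1800000</value>
  </property>
</configuration>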