Posted to hdfs-user@hadoop.apache.org by Barak Yaish <ba...@gmail.com> on 2012/11/26 13:43:23 UTC

hbase hive integration - "FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory"

Hi,



We would like to integrate Hive with HBase, so we are following the
instructions listed here: https://cwiki.apache.org/Hive/hbaseintegration.html.



Hadoop 1.0.4

Hbase 0.94.2

Hive 0.9.0



I’ve updated HADOOP_CLASSPATH in hadoop-env.sh with the Hive jars. When running
hive and trying to create the table as described in the link above, the
following error is thrown:



hive> CREATE TABLE hbase_table_1(key int, value string)

    > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'

    > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")

    > TBLPROPERTIES ("hbase.table.name" = "xyz");

FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
creating transactional connection factory

NestedThrowables:

java.lang.reflect.InvocationTargetException

FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask



Can anyone suggest what else needs to be done in order to set up the
integration?
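
For reference, the classpath wiring the wiki calls for amounts to roughly the
sketch below; the jar names are assumptions for Hive 0.9.0 / HBase 0.94.2 and
have to match what is actually installed:

# hadoop-env.sh: make the HBase storage handler and its dependencies visible
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/hive-hbase-handler-0.9.0.jar:$HBASE_HOME/hbase-0.94.2.jar:$HBASE_HOME/lib/zookeeper-<version>.jar

# or pass the same jars to the Hive CLI when starting it:
hive --auxpath $HIVE_HOME/lib/hive-hbase-handler-0.9.0.jar,$HBASE_HOME/hbase-0.94.2.jar,$HBASE_HOME/lib/zookeeper-<version>.jar \
     --hiveconf hbase.zookeeper.quorum=<zookeeper-host>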



Thanks.

Re: hbase hive integration - "FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory"

Posted by Barak Yaish <ba...@gmail.com>.
Yep, derby-10.4.2.0.jar is there. Should I add it to HADOOP_CLASSPATH as well?

Hadoop 1.0.4

Hive 0.9.0

On Mon, Nov 26, 2012 at 3:29 PM, Mohammad Tariq <do...@gmail.com> wrote:

> Do you have the derby-*jar in your $HIVE_HOME/lib directory?If not,
> download it put it there. BTW, which version are you using?
>
> Regards,
>     Mohammad Tariq
>
>
>
> On Mon, Nov 26, 2012 at 6:37 PM, Barak Yaish <ba...@gmail.com>wrote:
>
>> Hi,
>>
>> I'm using the default metadata store, but indeed looks like my hive
>> doesn't feel that well:
>>
>> hive> CREATE TABLE pokes (foo INT, bar STRING);
>>  FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
>> creating transactional connection factory
>> NestedThrowables:
>> java.lang.reflect.InvocationTargetException
>> FAILED: Execution Error, return code 1 from
>> org.apache.hadoop.hive.ql.exec.DDLTask
>> hive>
>>
>> Can you point me how to troubleshoot this issue, or its out of this list
>> scope?
>>
>> On Mon, Nov 26, 2012 at 2:53 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>
>>> Hello Barak,
>>>
>>>      Are you using the default metadata store(derby) or tying to use
>>> something else like MySQL?If latter is the case, make sure you have the
>>> necessary connector in place. Also, is your Hive working fine independently?
>>>
>>> Regards,
>>>     Mohammad Tariq
>>>
>>>
>>>
>>> On Mon, Nov 26, 2012 at 6:13 PM, Barak Yaish <ba...@gmail.com>wrote:
>>>
>>>> Hi,
>>>>
>>>>
>>>>
>>>> We would like to integrate hive with hbase, so we are following the
>>>> instructions listed here
>>>> https://cwiki.apache.org/Hive/hbaseintegration.html.
>>>>
>>>>
>>>>
>>>> Hadoop 1.0.4
>>>>
>>>> Hbase 0.94.2
>>>>
>>>> Hive 0.9.0
>>>>
>>>>
>>>>
>>>> I’ve updated hadoop-env.sh HADOOP_CLASSPATH with hive jars. When
>>>> running hive and trying to create the table as described in the above link,
>>>> the following error is thrown:
>>>>
>>>>
>>>>
>>>> hive> CREATE TABLE hbase_table_1(key int, value string)
>>>>
>>>>     > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>>>>
>>>>     > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
>>>>
>>>>     > TBLPROPERTIES ("hbase.table.name" = "xyz");
>>>>
>>>> FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
>>>> creating transactional connection factory
>>>>
>>>> NestedThrowables:
>>>>
>>>> java.lang.reflect.InvocationTargetException
>>>>
>>>> FAILED: Execution Error, return code 1 from
>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>
>>>>
>>>>
>>>> Can anyone suggest what else need to be done in order to set up the
>>>> integration?
>>>>
>>>>
>>>>
>>>> Thanks.
>>>>
>>>
>>>
>>
>

Re: hbase hive integration - "FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory"

Posted by Mohammad Tariq <do...@gmail.com>.
Do you have the derby-* jar in your $HIVE_HOME/lib directory? If not,
download it and put it there. BTW, which version are you using?

Regards,
    Mohammad Tariq



On Mon, Nov 26, 2012 at 6:37 PM, Barak Yaish <ba...@gmail.com> wrote:

> Hi,
>
> I'm using the default metadata store, but indeed looks like my hive
> doesn't feel that well:
>
> hive> CREATE TABLE pokes (foo INT, bar STRING);
>  FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
> creating transactional connection factory
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
> hive>
>
> Can you point me how to troubleshoot this issue, or its out of this list
> scope?
>
> On Mon, Nov 26, 2012 at 2:53 PM, Mohammad Tariq <do...@gmail.com>wrote:
>
>> Hello Barak,
>>
>>      Are you using the default metadata store(derby) or tying to use
>> something else like MySQL?If latter is the case, make sure you have the
>> necessary connector in place. Also, is your Hive working fine independently?
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>>
>> On Mon, Nov 26, 2012 at 6:13 PM, Barak Yaish <ba...@gmail.com>wrote:
>>
>>> Hi,
>>>
>>>
>>>
>>> We would like to integrate hive with hbase, so we are following the
>>> instructions listed here
>>> https://cwiki.apache.org/Hive/hbaseintegration.html.
>>>
>>>
>>>
>>> Hadoop 1.0.4
>>>
>>> Hbase 0.94.2
>>>
>>> Hive 0.9.0
>>>
>>>
>>>
>>> I’ve updated hadoop-env.sh HADOOP_CLASSPATH with hive jars. When running
>>> hive and trying to create the table as described in the above link, the
>>> following error is thrown:
>>>
>>>
>>>
>>> hive> CREATE TABLE hbase_table_1(key int, value string)
>>>
>>>     > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>>>
>>>     > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
>>>
>>>     > TBLPROPERTIES ("hbase.table.name" = "xyz");
>>>
>>> FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
>>> creating transactional connection factory
>>>
>>> NestedThrowables:
>>>
>>> java.lang.reflect.InvocationTargetException
>>>
>>> FAILED: Execution Error, return code 1 from
>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>
>>>
>>>
>>> Can anyone suggest what else need to be done in order to set up the
>>> integration?
>>>
>>>
>>>
>>> Thanks.
>>>
>>
>>
>
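
A quick sanity check, sketched below with assumed install locations, is to list
the Derby jars each component can see; if two different Derby versions end up on
the combined Hadoop/Hive classpath, the metastore connection factory can fail to
come up with errors like the one above.

# assumed locations -- adjust to the actual setup
ls $HIVE_HOME/lib | grep -i derby
ls $HADOOP_HOME/lib | grep -i derby    # Hadoop may bundle its own Derby jar
ls $HBASE_HOME/lib | grep -i derby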

Re: hbase hive integration - "FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory"

Posted by Barak Yaish <ba...@gmail.com>.
Hi,

I'm using the default metadata store, but it does indeed look like my Hive
isn't feeling that well:

hive> CREATE TABLE pokes (foo INT, bar STRING);
FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask
hive>

Can you point me to how to troubleshoot this issue, or is it out of this list's
scope?

On Mon, Nov 26, 2012 at 2:53 PM, Mohammad Tariq <do...@gmail.com> wrote:

> Hello Barak,
>
>      Are you using the default metadata store(derby) or tying to use
> something else like MySQL?If latter is the case, make sure you have the
> necessary connector in place. Also, is your Hive working fine independently?
>
> Regards,
>     Mohammad Tariq
>
>
>
> On Mon, Nov 26, 2012 at 6:13 PM, Barak Yaish <ba...@gmail.com>wrote:
>
>> Hi,
>>
>>
>>
>> We would like to integrate hive with hbase, so we are following the
>> instructions listed here
>> https://cwiki.apache.org/Hive/hbaseintegration.html.
>>
>>
>>
>> Hadoop 1.0.4
>>
>> Hbase 0.94.2
>>
>> Hive 0.9.0
>>
>>
>>
>> I’ve updated hadoop-env.sh HADOOP_CLASSPATH with hive jars. When running
>> hive and trying to create the table as described in the above link, the
>> following error is thrown:
>>
>>
>>
>> hive> CREATE TABLE hbase_table_1(key int, value string)
>>
>>     > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>>
>>     > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
>>
>>     > TBLPROPERTIES ("hbase.table.name" = "xyz");
>>
>> FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
>> creating transactional connection factory
>>
>> NestedThrowables:
>>
>> java.lang.reflect.InvocationTargetException
>>
>> FAILED: Execution Error, return code 1 from
>> org.apache.hadoop.hive.ql.exec.DDLTask
>>
>>
>>
>> Can anyone suggest what else need to be done in order to set up the
>> integration?
>>
>>
>>
>> Thanks.
>>
>
>
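
One way to dig out the real cause hidden behind the InvocationTargetException,
assuming the default logging setup, is to rerun the failing statement with the
Hive logger sent to the console, and then to look at the Hive log file:

# print the full stack trace of the metastore failure on the console
hive --hiveconf hive.root.logger=DEBUG,console

# by default the same detail is written to the Hive log under /tmp/<user>
tail -n 100 /tmp/$USER/hive.log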

Re: hbase hive integration - "FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory"

Posted by Mohammad Tariq <do...@gmail.com>.
Hello Barak,

     Are you using the default metadata store (Derby) or trying to use
something else like MySQL? If the latter is the case, make sure you have the
necessary connector in place. Also, is your Hive working fine independently?

Regards,
    Mohammad Tariq



On Mon, Nov 26, 2012 at 6:13 PM, Barak Yaish <ba...@gmail.com> wrote:

> Hi,
>
>
>
> We would like to integrate hive with hbase, so we are following the
> instructions listed here
> https://cwiki.apache.org/Hive/hbaseintegration.html.
>
>
>
> Hadoop 1.0.4
>
> Hbase 0.94.2
>
> Hive 0.9.0
>
>
>
> I’ve updated hadoop-env.sh HADOOP_CLASSPATH with hive jars. When running
> hive and trying to create the table as described in the above link, the
> following error is thrown:
>
>
>
> hive> CREATE TABLE hbase_table_1(key int, value string)
>
>     > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>
>     > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
>
>     > TBLPROPERTIES ("hbase.table.name" = "xyz");
>
> FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error
> creating transactional connection factory
>
> NestedThrowables:
>
> java.lang.reflect.InvocationTargetException
>
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
>
>
>
> Can anyone suggest what else need to be done in order to set up the
> integration?
>
>
>
> Thanks.
>
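
For context, the metastore connection comes from hive-site.xml (or from the
built-in defaults when nothing is overridden there). A rough sketch of what to
check for the embedded Derby case, with the stock defaults shown as comments:

# stock defaults for the embedded Derby metastore:
#   javax.jdo.option.ConnectionURL         jdbc:derby:;databaseName=metastore_db;create=true
#   javax.jdo.option.ConnectionDriverName  org.apache.derby.jdbc.EmbeddedDriver
grep -A 2 'javax.jdo.option.Connection' $HIVE_HOME/conf/hive-site.xml

Note too that the embedded Derby database lives in ./metastore_db relative to
the directory hive is started from, and only one session can open it at a time,
so a leftover lock file from a crashed session can also produce
connection-factory failures.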
