Posted to user@sqoop.apache.org by Vaibhav V Nirkhe <va...@impetus.co.in> on 2013/10/09 14:49:12 UTC

Issue in Sqoop Export from HDFS(Hbase data) to MySql

Hi,
       I am using Sqoop 1.4.3 on Hadoop 1.2.1 and trying to export HBase data placed in HDFS to MySQL; however, I am getting the following ClassCastException:

I am using the following command:

sqoop export --connect jdbc:mysql://localhost:3306/OMS --username root -P --table CNT_REPORT_DATA --columns CUSTOMER_ID,MONTH  --export-dir /user/hduser/esr_data --verbose -m 1

I guess Sqoop is trying to fetch each record by its key and is not able to cast the key:

java.lang.ClassCastException: org.apache.hadoop.hbase.io.ImmutableBytesWritable cannot be cast to org.apache.hadoop.io.LongWritable
    at org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:95)
    at org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:38)
    at org.apache.sqoop.mapreduce.CombineFileRecordReader.getCurrentKey(CombineFileRecordReader.java:79)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.getCurrentKey(MapTask.java:503)
    at org.apache.hadoop.mapreduce.MapContext.getCurrentKey(MapContext.java:57)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)


I don't understand why the key is always expected to be a LongWritable here. Please advise as soon as possible.



Thanks in advance,




Re: Issue in Sqoop Export from HDFS(Hbase data) to MySql

Posted by Abraham Elmahrek <ab...@cloudera.com>.
Ah I see! Sorry, I didn't understand that!

The HBase export should dump the data into a SequenceFile, I believe. You might
have to transform that into another format (perhaps with Hive or Pig?) and then
export the data.
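
For example, something along these lines might work, assuming the Hive HBase
integration is set up. This is only an untested sketch: the Hive table names,
the column family "d", and the column qualifiers are placeholders, since I
don't know your HBase schema.

hive -e "
  -- Map a Hive table onto the existing HBase table (placeholder names/mapping).
  CREATE EXTERNAL TABLE cnt_report_data_hbase (rowkey STRING, customer_id STRING, report_month STRING)
  STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
  WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:CUSTOMER_ID,d:MONTH')
  TBLPROPERTIES ('hbase.table.name' = 'CNT_REPORT_DATA');

  -- A plain tab-delimited text table whose files Sqoop can read directly.
  CREATE EXTERNAL TABLE cnt_report_data_text (customer_id STRING, report_month STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS TEXTFILE
  LOCATION '/user/hduser/esr_data_text';

  -- Copy the HBase rows into the text table on HDFS.
  INSERT OVERWRITE TABLE cnt_report_data_text
  SELECT customer_id, report_month FROM cnt_report_data_hbase;
"

sqoop export --connect jdbc:mysql://localhost:3306/OMS --username root -P \
  --table CNT_REPORT_DATA --columns CUSTOMER_ID,MONTH \
  --export-dir /user/hduser/esr_data_text \
  --input-fields-terminated-by '\t' -m 1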

Alternatively, you might be able to upgrade and use the HCatalog
integration. Check out
http://mail-archives.apache.org/mod_mbox/sqoop-user/201308.mbox/%3C20130824230724.GE4943@localhost%3E
for details.
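
With that integration (Sqoop 1.4.4 or later, if I remember correctly) the export
can point at a Hive/HCatalog table instead of a raw directory, roughly like this,
with placeholder database and table names:

sqoop export --connect jdbc:mysql://localhost:3306/OMS --username root -P \
  --table CNT_REPORT_DATA \
  --hcatalog-database default --hcatalog-table cnt_report_data_text \
  -m 1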

-Abe


On Wed, Oct 9, 2013 at 10:12 AM, Vaibhav V Nirkhe <
vaibhav.nirkhe@impetus.co.in> wrote:

>  Hi Abraham,
>                      Thanks !! I have done the same I  have exported the
> data from hbase to hdfs and exported file is placed at */user/hduser/esr_data
> *still I am getting this exception . Please let me know what is wrong
> below .
>
> One thing I could observe is exported file from Hbase seems to have
> serialized objects instead of  tsv data . But dont know how to get .tsv
> format through hbase export .
>
> Thanks and regards,
> Vaibhav Nirkhe
>  ------------------------------
> *From:* Abraham Elmahrek [abe@cloudera.com]
> *Sent:* Wednesday, October 09, 2013 10:34 PM
> *To:* user@sqoop.apache.org
> *Subject:* Re: Issue in Sqoop Export from HDFS(Hbase data) to MySql
>
>   User,
>
>  Hbase exporting is currently not supported in Sqoop.
>
>  What you can do is export the Hbase data into HDFS first, then use Sqoop
> to transfer it into MySQL.
>
>  -Abe

RE: Issue in Sqoop Export from HDFS(Hbase data) to MySql

Posted by Vaibhav V Nirkhe <va...@impetus.co.in>.
Hi Abraham,
                     Thanks!! I have done the same: I have exported the data from HBase to HDFS, and the exported file is placed at /user/hduser/esr_data, but I am still getting this exception. Please let me know what is wrong below.

One thing I could observe is that the exported file from HBase seems to contain serialized objects instead of TSV data, but I don't know how to get .tsv format out of the HBase export.
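
(To double-check the format, I think something like the following should show
whether these are SequenceFiles; the part file name below is a guess, and
"hadoop fs -text" may need the HBase jars on the classpath to decode the values.)

hadoop fs -ls /user/hduser/esr_data
hadoop fs -cat /user/hduser/esr_data/part-m-00000 | head -c 200
hadoop fs -text /user/hduser/esr_data/part-m-00000 | head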

Thanks and regards,
Vaibhav Nirkhe
________________________________
From: Abraham Elmahrek [abe@cloudera.com]
Sent: Wednesday, October 09, 2013 10:34 PM
To: user@sqoop.apache.org
Subject: Re: Issue in Sqoop Export from HDFS(Hbase data) to MySql

User,

Hbase exporting is currently not supported in Sqoop.

What you can do is export the Hbase data into HDFS first, then use Sqoop to transfer it into MySQL.

-Abe




Re: Issue in Sqoop Export from HDFS(Hbase data) to MySql

Posted by Abraham Elmahrek <ab...@cloudera.com>.
User,

HBase export is currently not supported in Sqoop.

What you can do is export the HBase data into HDFS first, then use Sqoop to
transfer it into MySQL.
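
(For reference, the HBase-to-HDFS step would presumably be the bundled export
job, something like the command below; the HBase table name is a guess. Note
that this job writes a SequenceFile of serialized Result objects keyed by
ImmutableBytesWritable rather than delimited text.)

hbase org.apache.hadoop.hbase.mapreduce.Export CNT_REPORT_DATA /user/hduser/esr_data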

-Abe


On Wed, Oct 9, 2013 at 5:49 AM, Vaibhav V Nirkhe <
vaibhav.nirkhe@impetus.co.in> wrote:

>  Hi ,
>         I am using Sqoop 1.4.3 on Hadoop 1.2.1 and trying to export HBase
> data placed in HDFS to MySQL , however I am getting following
> ClassCastException :-
>
> I am using following command :-
>
> sqoop export --connect jdbc:mysql://localhost:3306/OMS --username root -P
> --table CNT_REPORT_DATA --columns CUSTOMER_ID,MONTH  --export-dir
> /user/hduser/esr_data --verbose -m 1
>
> I guess Sqoop is trying to fetch the record by its key and not able to
> cast the key :-
>
> java.lang.ClassCastException:
> org.apache.hadoop.hbase.io.ImmutableBytesWritable cannot be cast to
> org.apache.hadoop.io.LongWritable
>     at
> org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:95)
>     at
> org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:38)
>     at
> org.apache.sqoop.mapreduce.CombineFileRecordReader.getCurrentKey(CombineFileRecordReader.java:79)
>     at
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.getCurrentKey(MapTask.java:503)
>     at
> org.apache.hadoop.mapreduce.MapContext.getCurrentKey(MapContext.java:57)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
>     at
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>     at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
>
> I don't understand why the key is always expected to be LongWritable here
> ?  Please suggest asap .
>
>
>
> Thanks in advance,
>