Posted to user@sqoop.apache.org by Krishna Rao <kr...@gmail.com> on 2012/12/20 11:37:28 UTC
Export SequenceFile compressed Hive table
Hi all,
Is it possible to export a SequenceFile compressed Hive table using Sqoop?
Its absence from the docs, and this <https://issues.cloudera.org/browse/SQOOP-93>,
suggests I can't.
And attempting it results in:
***
12/12/20 10:15:13 INFO mapreduce.Job: Task Id : attempt_1354275827266_5834_m_000000_2, Status : FAILED
Error: java.lang.ClassCastException: org.apache.hadoop.io.BytesWritable cannot be cast to org.apache.hadoop.io.LongWritable
        at org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:95)
        at org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:38)
        at org.apache.sqoop.mapreduce.CombineFileRecordReader.getCurrentKey(CombineFileRecordReader.java:77)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.getCurrentKey(MapTask.java:470)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.getCurrentKey(MapContextImpl.java:70)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.getCurrentKey(WrappedMapper.java:81)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
***
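For context, an export that hits this error would look roughly like the sketch below. The connect string, table, and directory names are hypothetical (the original message doesn't include the command); the point is that --export-dir points at a Hive warehouse directory containing SequenceFiles, while Sqoop's export record reader assumes plain text files (LongWritable offsets as keys), producing the BytesWritable cast failure above.

```
# Hypothetical invocation, not from the original message.
# /user/hive/warehouse/my_table holds SequenceFiles, which the
# text-oriented export mapper cannot read.
sqoop export \
  --connect jdbc:mysql://dbhost/warehouse \
  --username etl \
  --table my_table \
  --export-dir /user/hive/warehouse/my_table
```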
Cheers,
Krishna
Re: Export SequenceFile compressed Hive table
Posted by Chalcy <ch...@gmail.com>.
Hi Krishna,
No, Sqoop can't work with SequenceFile-backed Hive tables directly yet.
As a workaround, we use Sqoop to import the data Snappy-compressed into
Hive, then create a second Hive table with the same schema but stored as
SequenceFile, and run an INSERT OVERWRITE from the non-SequenceFile table
into the SequenceFile one. This works well, at the cost of the extra copy
step.
For partitioned tables, that SELECT has to list all the fields
explicitly.
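A sketch of that staging-table pattern in HiveQL (table and column names here are made up for illustration; they are not from the original message):

```
-- Hypothetical names. Table populated by the Sqoop import (plain text):
CREATE TABLE my_table_text (id BIGINT, name STRING)
  STORED AS TEXTFILE;

-- Same schema, SequenceFile storage:
CREATE TABLE my_table_seq (id BIGINT, name STRING)
  STORED AS SEQUENCEFILE;

-- Copy the data across; for a partitioned table the SELECT must name
-- every column explicitly rather than using SELECT *.
INSERT OVERWRITE TABLE my_table_seq
SELECT id, name FROM my_table_text;
```

Presumably the same copy in the opposite direction (SequenceFile table into a text-format staging table) would give Sqoop a plain-text directory it can export, sidestepping the ClassCastException above.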
Hope this helps,
Chalcy
On Thu, Dec 20, 2012 at 5:37 AM, Krishna Rao <kr...@gmail.com> wrote:
> Hi all,
>
> Is it possible to export a SequenceFile compressed Hive table using Sqoop?
>
> Its absence from the docs, and this <https://issues.cloudera.org/browse/SQOOP-93>,
> suggests I can't.
>
> Cheers,
>
> Krishna
>