Posted to common-user@hadoop.apache.org by rk vishu <ta...@gmail.com> on 2012/01/26 23:09:47 UTC
Map Red SequenceFile output to Hive table
Hello All,
I have a mapred job that does a transformation and outputs to a compressed
SequenceFile (using org.apache.hadoop.mapred.SequenceFileOutputFormat).
I am able to attach the output to an external Hive table (stored as
sequencefile). When I query it, Hive ignores the first column value from the file.
Is there a way to generate the MapReduce output as expected by Hive?
Any tips on this are highly appreciated.
-R
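[Editor's note: the behavior described here is that Hive's SequenceFile reader passes only the record *value* to the SerDe and discards the key, so a column written by the job as the key appears to vanish. The sketch below shows the value-formatting side; the class and helper names are illustrative, not from the original job, and the real reducer would emit the resulting line under a NullWritable key.]

```java
// Hive's SequenceFile reader deserializes only the record value; the
// key is discarded. Every column the table declares must therefore be
// packed into the value, using the same field delimiter as the table
// DDL ('\t' here). HiveValueLine is an illustrative helper, not part
// of any Hadoop or Hive API.
public class HiveValueLine {

    // Join the row's columns into the tab-delimited value string that
    // a "fields terminated by '\t'" Hive table expects.
    public static String toValue(String... cols) {
        return String.join("\t", cols);
    }

    public static void main(String[] args) {
        String line = toValue("a", "b", "c");
        // In the actual reducer this line would be emitted as:
        //   output.collect(NullWritable.get(), new Text(line));
        System.out.println(line);
    }
}
```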
Re: Map Red SequenceFile output to Hive table
Posted by Mapred Learn <ma...@gmail.com>.
Are you missing the external keyword here:
create external table...<>
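[Editor's note: a corrected version of the statement from the quoted message might look like the following. This is an untested sketch; columns and paths are taken from the message. With EXTERNAL and a LOCATION pointing at the job's output directory, the separate LOAD DATA step can usually be skipped.]

```sql
CREATE EXTERNAL TABLE stg.my_tab (
  col1 STRING,
  col2 STRING,
  col3 STRING
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\n'
STORED AS SEQUENCEFILE
LOCATION '/xyz/mytable/';
```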
On Thu, Jan 26, 2012 at 5:21 PM, rk vishu <ta...@gmail.com> wrote:
> Something like below.
>
> CREATE TABLE stg.my_tab(
> col1 String,
> col2 String,
> col3 String
> ) row format delimited fields terminated by '\t' lines terminated by '\n'
> stored as sequencefile
> location '/xyz/mytable/';
> LOAD DATA INPATH '/tmp/mymapredout/part-*' INTO TABLE stg.my_tab;
Re: Map Red SequenceFile output to Hive table
Posted by rk vishu <ta...@gmail.com>.
Something like below.
CREATE TABLE stg.my_tab(
col1 String,
col2 String,
col3 String
) row format delimited fields terminated by '\t' lines terminated by '\n'
stored as sequencefile
location '/xyz/mytable/';
LOAD DATA INPATH '/tmp/mymapredout/part-*' INTO TABLE stg.my_tab;
On Thu, Jan 26, 2012 at 3:49 PM, Mapred Learn <ma...@gmail.com> wrote:
> Can u share your create table command ?
Re: Map Red SequenceFile output to Hive table
Posted by Mapred Learn <ma...@gmail.com>.
Can you share your create table command?
On Jan 26, 2012, at 2:21 PM, rk vishu <ta...@gmail.com> wrote:
> I did specify the first column in the table creation.
Re: Map Red SequenceFile output to Hive table
Posted by rk vishu <ta...@gmail.com>.
I did specify the first column in the table creation.
On Thu, Jan 26, 2012 at 2:15 PM, Mapred Learn <ma...@gmail.com> wrote:
> In your external table creation, do you specify the first column ?
Re: Map Red SequenceFile output to Hive table
Posted by Mapred Learn <ma...@gmail.com>.
In your external table creation, do you specify the first column?