Posted to user@hive.apache.org by Sravya Tirukkovalur <sr...@cloudera.com> on 2013/09/22 03:41:52 UTC

External table with Avro Serde

This command fails:

CREATE EXTERNAL TABLE avrotab
PARTITIONED BY(dummy int)
LOCATION '/user/user1/avrodata'
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
TBLPROPERTIES ( 'avro.schema.url'='file:///tmp/data.avsc');

Error:
Error: Error while processing statement: FAILED: ParseException line 1:131
missing EOF at 'ROW' near ''/user/user1/avrodata'' (state=42000,code=40000)

Not sure what I am missing. Can anyone see any obvious mistake?

Thanks!
-- 
Sravya Tirukkovalur

Re: External table with Avro Serde

Posted by Sravya Tirukkovalur <sr...@cloudera.com>.
Thank you, that was helpful!


On Sat, Sep 21, 2013 at 7:02 PM, j.barrett Strausser <j.barrett.strausser@gmail.com> wrote:

> Fairly sure you just forgot the STORED AS clause; LOCATION also has to come
> after the ROW FORMAT / STORED AS clauses, which is what the parser is
> complaining about at 'ROW'.
>
> The below works for me on Hive 0.11
>
> CREATE EXTERNAL TABLE avrotab
> PARTITIONED BY(dq_dummy int)
> ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
> STORED AS
> INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
> OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
> LOCATION '/user/user1/avrodata'
>
> TBLPROPERTIES ( 'avro.schema.literal'='{
>   "namespace": "com.poweredanalytics.serializer",
>   "name": "help_hive_serializer",
>   "type": "record",
>   "fields": [
>     { "name":"dummy", "type":"int" },
>     { "name":"other", "type":["int","null"] }
>   ] }
> ')
> ;
>
> On Sat, Sep 21, 2013 at 9:41 PM, Sravya Tirukkovalur <sr...@cloudera.com> wrote:
>
>> This command fails:
>>
>> CREATE EXTERNAL TABLE avrotab
>> PARTITIONED BY(dummy int)
>> LOCATION '/user/user1/avrodata'
>> ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
>> INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
>> OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
>> TBLPROPERTIES ( 'avro.schema.url'='file:///tmp/data.avsc');
>>
>> Error:
>> Error: Error while processing statement: FAILED: ParseException line
>> 1:131 missing EOF at 'ROW' near ''/user/user1/avrodata''
>> (state=42000,code=40000)
>>
>> Not sure what I am missing. Can anyone see any obvious mistake?
>>
>> Thanks!
>> --
>> Sravya Tirukkovalur
>>
>
>
>
> --
>
>
> https://github.com/bearrito
> @deepbearrito
>



-- 
Sravya Tirukkovalur

Re: External table with Avro Serde

Posted by "j.barrett Strausser" <j....@gmail.com>.
Fairly sure you just forgot the STORED AS clause; LOCATION also has to come
after the ROW FORMAT / STORED AS clauses, which is what the parser is
complaining about at 'ROW'.

The below works for me on Hive 0.11

CREATE EXTERNAL TABLE avrotab
PARTITIONED BY(dq_dummy int)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS
INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION '/user/user1/avrodata'

TBLPROPERTIES ( 'avro.schema.literal'='{
  "namespace": "com.poweredanalytics.serializer",
  "name": "help_hive_serializer",
  "type": "record",
  "fields": [
    { "name":"dummy", "type":"int" },
    { "name":"other", "type":["int","null"] }
  ] }
')
;
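
If you would rather keep the schema in the external .avsc file, the same
clause ordering should also work with your original avro.schema.url
property. I have not run this exact variant, so treat it as a sketch that
just reuses your paths:

CREATE EXTERNAL TABLE avrotab
-- the partition column must not also appear as a field in data.avsc
PARTITIONED BY(dummy int)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS
INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION '/user/user1/avrodata'
TBLPROPERTIES ('avro.schema.url'='file:///tmp/data.avsc');

One caveat with a file:// schema URL: the file has to be readable on the
host that resolves it, so an HDFS URL is usually the safer choice. Also,
because the table is partitioned, data under /user/user1/avrodata will not
show up until partitions are registered, e.g. (with a made-up partition
value and path):

ALTER TABLE avrotab ADD PARTITION (dummy=1)
LOCATION '/user/user1/avrodata/dummy=1';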

On Sat, Sep 21, 2013 at 9:41 PM, Sravya Tirukkovalur <sr...@cloudera.com> wrote:

> This command fails:
>
> CREATE EXTERNAL TABLE avrotab
> PARTITIONED BY(dummy int)
> LOCATION '/user/user1/avrodata'
> ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
> INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
> OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
> TBLPROPERTIES ( 'avro.schema.url'='file:///tmp/data.avsc');
>
> Error:
> Error: Error while processing statement: FAILED: ParseException line 1:131
> missing EOF at 'ROW' near ''/user/user1/avrodata'' (state=42000,code=40000)
>
> Not sure what I am missing. Can anyone see any obvious mistake?
>
> Thanks!
> --
> Sravya Tirukkovalur
>



-- 


https://github.com/bearrito
@deepbearrito