Posted to user@hive.apache.org by ws <wl...@yahoo.com> on 2016/12/13 15:04:51 UTC

Sqoop Postgres table to Hive in Parquet format problem.

Hive: 2.1.0
Sqoop: 1.4.6

###
hive> select * from dimemployee;
OK
Failed with exception java.io.IOException:java.lang.RuntimeException: hdfs://ip-172-31-38-171.ec2.internal:8020/user/hive/warehouse/db2.db/dimemployee/.metadata/schemas/1.avsc is not a Parquet file. expected magic number at tail [80, 65, 82, 49] but found [101, 34, 10, 125]
Time taken: 0.068 seconds
###

However the Parquet file exists:
hadoop fs -ls /user/hive/warehouse/db2.db/dimemployee
Found 2 items
drwxr-xr-x   - root hadoop          0 2016-12-13 14:34 /user/hive/warehouse/db2.db/dimemployee/.metadata
-rw-r--r--   1 root hadoop      36970 2016-12-13 14:34 /user/hive/warehouse/db2.db/dimemployee/7816cb19-4616-4dda-89d3-e2a0dc42b7e4.parquet
####

Any hint greatly appreciated
Thanks
ws
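[Editor's note: the byte values in the error message are themselves a clue. Decoding them (Python used here purely as an illustration; none of this appears in the thread) shows the "found" bytes are the tail of a JSON document, which matches the `.avsc` Avro schema file named in the error rather than a Parquet file:]

```python
# Parquet files end with the 4-byte magic footer "PAR1".
expected = bytes([80, 65, 82, 49])
print(expected)  # b'PAR1'

# The bytes Hive actually found decode to the closing characters of a
# JSON document -- consistent with the 1.avsc schema file in the error.
found = bytes([101, 34, 10, 125])
print(found)  # b'e"\n}'
```

So Hive's Parquet reader is scanning the `.metadata` directory and choking on the JSON schema file, not on the real `.parquet` file.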

Re: Sqoop Postgres table to Hive in Parquet format problem.

Posted by Sharath Punreddy <sr...@gmail.com>.
Looks like you are using Avro to read a Parquet file.
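[Editor's note: this diagnosis can be confirmed on a local copy of the files (e.g. fetched with `hadoop fs -get`) by checking the 4-byte footer directly. A minimal sketch; the helper name and the in-memory stand-in contents are illustrative, not from the thread:]

```python
import io

PARQUET_MAGIC = b"PAR1"

def looks_like_parquet(stream):
    """Return True if the stream ends with the Parquet magic footer."""
    stream.seek(-4, io.SEEK_END)
    return stream.read(4) == PARQUET_MAGIC

# In-memory stand-ins for the two kinds of file Hive encountered:
parquet_like = io.BytesIO(b"...row groups and footer metadata..." + PARQUET_MAGIC)
avsc_like = io.BytesIO(b'{\n  "type": "record", "name": "dimemployee"\n}')

print(looks_like_parquet(parquet_like))  # True
print(looks_like_parquet(avsc_like))     # False
```

A workaround often cited for this Sqoop/Kite layout is to move the `.metadata` directory out of the table's location so Hive's Parquet reader never touches the `.avsc` files, though whether that is appropriate depends on how the dataset was imported.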

Sincerely,
Sharath Punreddy
Email:srpunreddy@gmail.com
Phone: 918-973-3399

On Tue, Dec 13, 2016 at 9:04 AM, ws <wl...@yahoo.com> wrote:

> Hive: 2.1.0
> Sqoop: 1.4.6
>
> ###
> hive> select * from dimemployee;
> OK
> Failed with exception java.io.IOException:java.lang.RuntimeException:
> hdfs://ip-172-31-38-171.ec2.internal:8020/user/hive/
> warehouse/db2.db/dimemployee/.metadata/schemas/1.avsc is not a Parquet
> file. expected magic number at tail [80, 65, 82, 49] but found [101, 34,
> 10, 125]
> Time taken: 0.068 seconds
> ###
>
> However the parquet file exists in:
> hadoop fs -ls /user/hive/warehouse/db2.db/dimemployee
> Found 2 items
> drwxr-xr-x   - root hadoop          0 2016-12-13 14:34
> /user/hive/warehouse/db2.db/dimemployee/.metadata
> -rw-r--r--   1 root hadoop      36970 2016-12-13 14:34
> /user/hive/warehouse/db2.db/dimemployee/7816cb19-4616-
> 4dda-89d3-e2a0dc42b7e4.parquet
> ####
>
> Any hint greatly appreciated
> Thanks
> ws
>