Posted to user@hive.apache.org by Raj Hadoop <ha...@yahoo.com> on 2013/11/03 16:39:55 UTC

Oracle to HDFS through Sqoop and a Hive External Table

Hi,

I am sending this to the Hadoop, Hive, and Sqoop mailing lists, as this question is closely related to all three areas.

I have this requirement.

I have a big table in Oracle (about 60 million rows, primary key Customer Id). I want to bring it into HDFS and then create
a Hive external table over it. My requirement is to run queries on this Hive table (at this time I do not know which queries I will be running).

Is the following a good design for this problem? What are the pros and cons?


1) Load the table into HDFS using Sqoop, split across multiple directories (divide the Customer Id range into 100 segments) - a sketch of one segment's import is below.
2) Create a Hive external partitioned table over those 100 HDFS directories - the DDL I have in mind follows the Sqoop sketch.
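
For step 1, something like the following is what I have in mind for one segment. The connect string, credentials, paths, and the CUSTOMERS/CUSTOMer_ID names are placeholders, not my actual schema, and the Id bounds assume the 60 million Ids are roughly dense; in practice I would script a loop over the 100 ranges, varying --where and --target-dir:

  # Import segment 0 of 100; the loop would vary the Id range
  # and the target directory per segment.
  sqoop import \
    --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
    --username SCOTT \
    --password-file /user/raj/.oracle_password \
    --table CUSTOMERS \
    --where "CUSTOMER_ID >= 0 AND CUSTOMER_ID < 600000" \
    --split-by CUSTOMER_ID \
    --num-mappers 4 \
    --fields-terminated-by ',' \
    --target-dir /user/raj/customers/segment=0

Naming the target directories segment=0 through segment=99 is deliberate; it lines up with the partition layout in step 2.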

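For step 2, the DDL would look roughly like this (the column list is illustrative; the real one would come from the Oracle schema):

  -- External table over the Sqoop output; dropping the table
  -- later leaves the HDFS files in place.
  CREATE EXTERNAL TABLE customers (
    customer_id BIGINT,
    name        STRING
  )
  PARTITIONED BY (segment INT)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  LOCATION '/user/raj/customers';

  -- Register each segment directory as a partition
  -- (repeat, or script, for segment=1 through segment=99).
  ALTER TABLE customers ADD PARTITION (segment=0)
  LOCATION '/user/raj/customers/segment=0';

With the segment=N directory naming, MSCK REPAIR TABLE customers should also be able to pick up all the partitions in one go instead of 100 ALTER statements.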

Thanks,
Raj