Posted to user@sqoop.apache.org by "Atul Paldhikar (apaldhik)" <ap...@cisco.com> on 2014/11/26 08:26:07 UTC

Error loading data using Sqoop from Oracle

Hi All,

I made the data load work somehow, but I don't think it's the right way! Here is what I tried (sorry for the long email).


1. Hadoop 2.5.1 + sqoop-1.4.5.bin__hadoop-2.0.4-alpha: This failed with the following error. I did see the same issue asked on Stack Overflow, but no fix was posted. This is most likely due to a mismatch in the jar files.

Note: /tmp/sqoop-sas/compile/1ee9265317c9d91a077060f857c3e726/TEMP_ADDRESS.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/11/24 23:05:43 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-sas/compile/1ee9265317c9d91a077060f857c3e726/TEMP_ADDRESS.jar
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/InputFormat
        at java.lang.ClassLoader.defineClass1(Native Method)



2. Hadoop 2.5.1 + sqoop-1.4.5.bin__hadoop-1.0.0: This also failed, with a class version mismatch error. So I also downloaded Hadoop 1.2.1 and specified that as the HADOOP_MAPRED_PATH in the Sqoop configuration (to pick up the hadoop-core jar). This worked for both HDFS and Hive imports from Oracle :)



However, there are 2 issues:

2.1: I don't want to maintain two versions of Hadoop.

2.2: The imported file is not actually going to the HDFS I have set up; instead it is going to the local file system. For example, when I load a file into a Hive table from "hive", it is visible at hdfs://finattr-comp-dev-01:9999/apps/sas/hive/warehouse/<table>, whereas with the Sqoop import the file lands in /apps/sas/hive/warehouse/<table> on the local disk.
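For illustration, a quick listing shows the mismatch (the Hive-loaded table is on HDFS, while the Sqoop-imported files exist only on the local disk):

    hadoop fs -ls hdfs://finattr-comp-dev-01:9999/apps/sas/hive/warehouse/<table>   # Hive-loaded table, on HDFS
    ls /apps/sas/hive/warehouse/<table>                                             # Sqoop output, local disk only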

Thanks
- Atul

Re: Error loading data using Sqoop from Oracle

Posted by Deepak Vohra <dv...@yahoo.com>.
Atul,
1. Regarding the different versions of Hadoop: "use Sqoop 2 only if it contains all the features required for your use case, otherwise, continue to use Sqoop 1." See "Feature Differences - Sqoop 1 and Sqoop 2" on www.cloudera.com; for example, connectors for all major RDBMSs are supported in Sqoop 1 but not in Sqoop 2.



Use the following combination of Sqoop and Hadoop:

Hadoop 2.0.0 (CDH 4.6)
Sqoop 1.4.3 (CDH 4.6)
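
Alternatively, if you want to stay on the Apache tarballs: the NoClassDefFoundError on org/apache/hadoop/mapreduce/InputFormat usually just means Sqoop cannot find the Hadoop MapReduce jars. Sqoop locates them through environment variables (note it reads HADOOP_MAPRED_HOME, not HADOOP_MAPRED_PATH), so pointing both variables at your single Hadoop 2.5.1 install should remove the need to keep a second Hadoop version around. A sketch, assuming Hadoop is unpacked under /opt/hadoop-2.5.1 (adjust the path for your machine):

    # Point Sqoop at one Hadoop installation for both the common and MapReduce jars
    export HADOOP_COMMON_HOME=/opt/hadoop-2.5.1
    export HADOOP_MAPRED_HOME=/opt/hadoop-2.5.1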
2. Regarding the import going to the local filesystem, specify the target directory with an hdfs:// URI: --target-dir "hdfs://finattr-comp-dev-01:9999/apps/sas/hive/warehouse/<table>"
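
A full command might look like the following sketch; the JDBC URL, username, and table name are placeholders for your environment (TEMP_ADDRESS is taken from your log output):

    sqoop import \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username SCOTT -P \
      --table TEMP_ADDRESS \
      --target-dir "hdfs://finattr-comp-dev-01:9999/apps/sas/hive/warehouse/TEMP_ADDRESS"

If the files still land on the local disk, also check that HADOOP_CONF_DIR points at the directory containing your core-site.xml; when Sqoop cannot see fs.defaultFS it falls back to the local file system.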


Thanks,
Deepak