Posted to user@hive.apache.org by Ajit Kumar Shreevastava <Aj...@hcl.com> on 2013/03/05 14:54:13 UTC

Error while exporting table data from hive to Oracle through Sqoop

Hi All,

I am facing the following issue while exporting a table from Hive to Oracle. Importing tables from Oracle to Hive and HDFS works fine. Please let me know where I am going wrong. I am pasting my screen output here.


[hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER  --table BTTN_BKP --export-dir  /home/hadoop/user/hive/warehouse/bttn  -P --verbose  -m 1  --input-fields-terminated-by '\001'
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
13/03/05 19:20:11 DEBUG tool.BaseSqoopTool: Enabled debug logging.
Enter password:
13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
13/03/05 19:20:16 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:oracle:thin:@10.99.42.11
13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Instantiated new connection cache.
13/03/05 19:20:16 INFO manager.SqlManager: Using default fetchSize of 1000
13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.OracleManager@2abe0e27
13/03/05 19:20:16 INFO tool.CodeGenTool: Beginning code generation
13/03/05 19:20:16 DEBUG manager.OracleManager: Using column names query: SELECT t.* FROM BTTN_BKP t WHERE 1=0
13/03/05 19:20:16 DEBUG manager.OracleManager: Creating a new connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb, using username: HDFSUSER
13/03/05 19:20:16 DEBUG manager.OracleManager: No connection paramenters specified. Using regular API for making connection.
13/03/05 19:20:16 INFO manager.OracleManager: Time zone has been set to GMT
13/03/05 19:20:16 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
13/03/05 19:20:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM BTTN_BKP t WHERE 1=0
13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
13/03/05 19:20:16 DEBUG orm.ClassWriter: selected columns:
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   DATA_INST_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   SCR_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_NU
13/03/05 19:20:16 DEBUG orm.ClassWriter:   CAT
13/03/05 19:20:16 DEBUG orm.ClassWriter:   WDTH
13/03/05 19:20:16 DEBUG orm.ClassWriter:   HGHT
13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SCAN
13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SHFT
13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR
13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_FL
13/03/05 19:20:16 DEBUG orm.ClassWriter:   LCLZ_FL
13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NU
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_ATVT
13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_CLIK
13/03/05 19:20:16 DEBUG orm.ClassWriter:   ENBL_FL
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_NAME
13/03/05 19:20:16 DEBUG orm.ClassWriter:   MKT_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_TS
13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_USER_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_TS
13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_USER_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_TS
13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_USER_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   DLTD_FL
13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NA
13/03/05 19:20:16 DEBUG orm.ClassWriter:   PRD_CD
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_NA
13/03/05 19:20:16 DEBUG orm.ClassWriter:   SOUND_FILE_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   IS_DYNMC_BTTN
13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java
13/03/05 19:20:16 DEBUG orm.ClassWriter: Table name: BTTN_BKP
13/03/05 19:20:16 DEBUG orm.ClassWriter: Columns: BTTN_ID:2, DATA_INST_ID:2, SCR_ID:2, BTTN_NU:2, CAT:2, WDTH:2, HGHT:2, KEY_SCAN:2, KEY_SHFT:2, FRGND_CPTN_COLR:12, FRGND_CPTN_COLR_PRSD:12, BKGD_CPTN_COLR:12, BKGD_CPTN_COLR_PRSD:12, BLM_FL:2, LCLZ_FL:2, MENU_ITEM_NU:2, BTTN_ASGN_LVL_ID:2, ON_ATVT:2, ON_CLIK:2, ENBL_FL:2, BLM_SET_ID:2, BTTN_ASGN_LVL_NAME:12, MKT_ID:2, CRTE_TS:93, CRTE_USER_ID:12, UPDT_TS:93, UPDT_USER_ID:12, DEL_TS:93, DEL_USER_ID:12, DLTD_FL:2, MENU_ITEM_NA:12, PRD_CD:2, BLM_SET_NA:12, SOUND_FILE_ID:2, IS_DYNMC_BTTN:2, FRGND_CPTN_COLR_ID:2, FRGND_CPTN_COLR_PRSD_ID:2, BKGD_CPTN_COLR_ID:2, BKGD_CPTN_COLR_PRSD_ID:2,
13/03/05 19:20:16 DEBUG orm.ClassWriter: sourceFilename is BTTN_BKP.java
13/03/05 19:20:16 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
13/03/05 19:20:16 INFO orm.CompilationManager: HADOOP_HOME is /home/hadoop/hadoop-1.0.3/libexec/..
13/03/05 19:20:16 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java
13/03/05 19:20:16 DEBUG orm.CompilationManager: Invoking javac with args:
13/03/05 19:20:16 DEBUG orm.CompilationManager:   -sourcepath
13/03/05 19:20:16 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
13/03/05 19:20:16 DEBUG orm.CompilationManager:   -d
13/03/05 19:20:16 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
13/03/05 19:20:16 DEBUG orm.CompilationManager:   -classpath
13/03/05 19:20:16 DEBUG orm.CompilationManager:   /home/hadoop/hadoop-1.0.3/libexec/../conf:/usr/java/jdk1.6.0_32/lib/tools.jar:/home/hadoop/hadoop-1.0.3/libexec/..:/home/hadoop/hadoop-1.0.3/libexec/../hadoop-core-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/asm-3.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjrt-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjtools-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-1.7.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-cli-1.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-codec-1.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-collections-3.2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-configuration-1.6.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-daemon-1.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-digester-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-el-1.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-httpclient-3.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-io-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-lang-2.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-1.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-api-1.0.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-math-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-net-1.4.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/core-3.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hsqldb-1.8.0.10.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-compiler-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-runtime-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jdeb-0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-core-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-json-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-server-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jets3t-0.6.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-util-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsch-0.1.42.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/junit-4.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/kfs-0.2.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/log4j-1.2.15.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/mockito-all-1.8.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/oro-2.0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/servlet-api-2.5-20081211.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-api-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/xmlenc-0.52.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/home/hadoop/sqoop/conf::/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/home/hadoop/sqoop/lib/avro-1.5.3.jar:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar:/home/hadoop/sqoop/lib/commons-io-1.4.jar:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar:/home/hadoop/sqoop/lib/ojdbc6.jar:/home/hadoop/sqoop/lib/paranamer-2.3.jar:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar:/home/hadoop/sqoop/sqoop-test-1.4.2.jar::/home/hadoop/hadoop-1.0.3/hadoop-core-1.0.3.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar
Note: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/03/05 19:20:18 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar
13/03/05 19:20:18 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e
13/03/05 19:20:18 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.class -> BTTN_BKP.class
13/03/05 19:20:18 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar
13/03/05 19:20:18 INFO mapreduce.ExportJobBase: Beginning export of BTTN_BKP
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Using InputFormat: class org.apache.sqoop.mapreduce.ExportInputFormat
13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Got cached connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
13/03/05 19:20:18 INFO manager.OracleManager: Time zone has been set to GMT
13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/paranamer-2.3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-1.5.3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/commons-io-1.4.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar
13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process : 4
13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Total input bytes=184266237
13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: maxSplitSize=184266237
13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process : 4
13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Generated splits:
13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat:   Paths:/home/hadoop/user/hive/warehouse/bttn/part-m-00000:0+20908340,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:0+67108864,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:67108864+24822805,/home/hadoop/user/hive/warehouse/bttn/part-m-00002:0+26675150,/home/hadoop/user/hive/warehouse/bttn/part-m-00003:0+44751078 Locations:NHCLT-PC44-2.hclt.corp.hcl.in:;
13/03/05 19:20:19 INFO mapred.JobClient: Running job: job_201303051835_0010
13/03/05 19:20:20 INFO mapred.JobClient:  map 0% reduce 0%
13/03/05 19:20:36 INFO mapred.JobClient:  map 7% reduce 0%
13/03/05 19:20:39 INFO mapred.JobClient:  map 11% reduce 0%
13/03/05 19:20:42 INFO mapred.JobClient:  map 16% reduce 0%
13/03/05 19:20:45 INFO mapred.JobClient:  map 17% reduce 0%
13/03/05 19:20:48 INFO mapred.JobClient:  map 20% reduce 0%
13/03/05 19:20:51 INFO mapred.JobClient:  map 27% reduce 0%
13/03/05 19:20:54 INFO mapred.JobClient:  map 32% reduce 0%
13/03/05 19:20:57 INFO mapred.JobClient:  map 33% reduce 0%
13/03/05 19:21:01 INFO mapred.JobClient:  map 38% reduce 0%
13/03/05 19:21:04 INFO mapred.JobClient:  map 39% reduce 0%
13/03/05 19:21:07 INFO mapred.JobClient:  map 43% reduce 0%
13/03/05 19:21:10 INFO mapred.JobClient:  map 44% reduce 0%
13/03/05 19:21:13 INFO mapred.JobClient:  map 48% reduce 0%
13/03/05 19:21:18 INFO mapred.JobClient: Task Id : attempt_201303051835_0010_m_000000_0, Status : FAILED
java.util.NoSuchElementException
        at java.util.AbstractList$Itr.next(AbstractList.java:350)
        at BTTN_BKP.__loadFromFields(BTTN_BKP.java:1349)
        at BTTN_BKP.parse(BTTN_BKP.java:1148)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

13/03/05 19:21:19 INFO mapred.JobClient:  map 0% reduce 0%
13/03/05 19:21:27 INFO mapred.JobClient: Task Id : attempt_201303051835_0010_m_000000_1, Status : FAILED
java.io.IOException: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated

        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:639)
        at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated

        at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:10345)
        at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:230)
        at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:228)

13/03/05 19:21:48 WARN mapred.JobClient: Error reading task outputConnection timed out
13/03/05 19:22:09 WARN mapred.JobClient: Error reading task outputConnection timed out
13/03/05 19:22:09 INFO mapred.JobClient: Job complete: job_201303051835_0010
13/03/05 19:22:09 INFO mapred.JobClient: Counters: 8
13/03/05 19:22:09 INFO mapred.JobClient:   Job Counters
13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=77152
13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/03/05 19:22:09 INFO mapred.JobClient:     Rack-local map tasks=3
13/03/05 19:22:09 INFO mapred.JobClient:     Launched map tasks=4
13/03/05 19:22:09 INFO mapred.JobClient:     Data-local map tasks=1
13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/03/05 19:22:09 INFO mapred.JobClient:     Failed map tasks=1
13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 110.4837 seconds (0 bytes/sec)
13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Exported 0 records.
13/03/05 19:22:09 ERROR tool.ExportTool: Error during export: Export job failed!
[hadoop@NHCLT-PC44-2 sqoop-oper]$

Regards,
Ajit Kumar Shreevastava



Re: Error while exporting table data from hive to Oracle through Sqoop

Posted by abhijeet gaikwad <ab...@gmail.com>.
It seems you're violating a unique key constraint in the second task attempt,
which is expected if Sqoop already committed some data in the first attempt.
This is an issue with Sqoop!

From the exception in the first attempt, it looks like there is some issue
when the auto-generated class (BTTN_BKP.java in your case) tries to parse the
data. Can you validate the data being inserted (not the unique constraint,
but possibly invalid data in some column of the input files)?
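
One quick check (just a sketch - it assumes the '\001' delimiter from your
command and the 39 columns listed in your log; adjust the path to your export
dir) is to scan for rows whose field count does not match. A short row
exhausts the field iterator in the generated __loadFromFields() and raises
exactly this java.util.NoSuchElementException:

# print line number and field count for every malformed record
$ hadoop fs -cat /home/hadoop/user/hive/warehouse/bttn/part-m-* \
    | awk -F'\001' 'NF != 39 { print NR ": " NF " fields" }'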

If the data is huge - here is a way of narrowing down your search for the
problematic row:
Run Sqoop with just one mapper (-m 1). Make sure you have one file that
contains the whole data. If there are multiple files, the way I can think of
right now is to run Sqoop per file or merge the data into one file. I am not
sure this info is enough for you to control the number of mappers, but the
motive here is to make sure only one map task is created for the job.
After running Sqoop and getting that exception, look at the data inserted in
the table to figure out which batch Sqoop was trying to insert next.
Try to find the erroneous row - by default a transaction contains 10,000 rows
(rows per insert statement (100) * no. of insert statements per transaction
(100)) before a commit is fired. So, check the next 10,000 rows! If you want
to narrow this down further, set this on your command line:

$ sqoop export -Dsqoop.export.records.per.statement=1 \
    -Dsqoop.export.statements.per.transaction=1 --connect ...

This will make sure you commit after the insertion of every row. I haven't
tried this with Oracle, but I was able to set the batch size using these
options for SQL Server/MySQL.
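
To see how far a single-mapper run got before failing (a sketch - the count
query is plain SQL, but the line numbers below are purely illustrative),
count the committed rows, then pull the next batch of lines from the input
file; with a single mapper reading one file in order, the failing record sits
just past that offset:

$ echo "SELECT COUNT(*) FROM BTTN_BKP;" | sqlplus -s HDFSUSER@//10.99.42.11:1521/clouddb
# e.g. if it prints 50000, the failing batch starts at input line 50001:
$ hadoop fs -cat /home/hadoop/user/hive/warehouse/bttn/part-m-00000 | sed -n '50001,51000p'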

Let's hope this helps you find the invalid data values, if any! If there is
no invalid data, I would suggest continuing this discussion on the Sqoop
user/dev mailing lists; you're still posting via the Hive user list.

Thanks,
Abhijeet

On Wed, Mar 6, 2013 at 11:14 AM, Ajit Kumar Shreevastava <
Ajit.Shreevastava@hcl.com> wrote:

>  Hi Abhijeet,
>
> The data is fine. The map phase first ran to 48% and then failed. After
> that, the map task tried to load the same data again, and the retry caused
> the unique constraint error.
>
> Regards,
>
> Ajit Kumar Shreevastava
>
> abhijeet gaikwad <ab...@gmail.com> wrote:
>
> + sqoop user
>
> The answer is in your exception! Check your data, you're hitting a unique
> key violation.
>
> Thanks,
> Abhijeet

Re: Error while exporting table data from hive to Oracle through Sqoop

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Ajit,
would you mind upgrading to Sqoop 1.4.3 RC 0 [1]? It has already been voted to be released as the final 1.4.3, so it should be safe to use.

One of the improvements in 1.4.3 is SQOOP-720 [2] that significantly improves the error message in this scenario.

Jarcec

Links:
1: http://people.apache.org/~hshreedharan/sqoop-1.4.3-rc0/
2: https://issues.apache.org/jira/browse/SQOOP-720

On Wed, Mar 06, 2013 at 05:44:44AM +0000, Ajit Kumar Shreevastava wrote:
> Hi Abhijeet,
> 
> The data is fine. The map phase first ran to 48% and then failed. After
> that, the map task tried to load the same data again, and the retry caused
> the unique constraint error.
> 
> Regards,
> 
> Ajit Kumar Shreevastava
> 
> abhijeet gaikwad <ab...@gmail.com> wrote:
> 
> + sqoop user
> 
> The answer is in your exception! Check your data, you're hitting a unique
> key violation.
> 
> Thanks,
> Abhijeet
> On Tue, Mar 5, 2013 at 7:24 PM, Ajit Kumar Shreevastava <Aj...@hcl.com>> wrote:
> Hi All,
> 
> I am facing following issue while exporting table from hive to Oracle. Importing table from Oracle to Hive and HDFS is working fine. Please let me know where I lag. I am pasting my screen output here.
> 
> 
> [hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb<http://jdbc:oracle:thin:@10.99.42.11:1521/clouddb> --username HDFSUSER  --table BTTN_BKP --export-dir  /home/hadoop/user/hive/warehouse/bttn  -P --verbose  -m 1  --input-fields-terminated-by '\001'
> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
> Please set $HBASE_HOME to the root of your HBase installation.
> 13/03/05 19:20:11 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> Enter password:
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
> 13/03/05 19:20:16 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:oracle:thin:@10.99.42.11<ma...@10.99.42.11>
> 13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Instantiated new connection cache.
> 13/03/05 19:20:16 INFO manager.SqlManager: Using default fetchSize of 1000
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.OracleManager@2abe0e27<ma...@2abe0e27>
> 13/03/05 19:20:16 INFO tool.CodeGenTool: Beginning code generation
> 13/03/05 19:20:16 DEBUG manager.OracleManager: Using column names query: SELECT t.* FROM BTTN_BKP t WHERE 1=0
> 13/03/05 19:20:16 DEBUG manager.OracleManager: Creating a new connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb<http://jdbc:oracle:thin:@10.99.42.11:1521/clouddb>, using username: HDFSUSER
> 13/03/05 19:20:16 DEBUG manager.OracleManager: No connection paramenters specified. Using regular API for making connection.
> 13/03/05 19:20:16 INFO manager.OracleManager: Time zone has been set to GMT
> 13/03/05 19:20:16 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> 13/03/05 19:20:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM BTTN_BKP t WHERE 1=0
> 13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER<http://jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: selected columns:
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DATA_INST_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   SCR_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_NU
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CAT
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   WDTH
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   HGHT
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SCAN
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SHFT
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_FL
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   LCLZ_FL
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NU
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_ATVT
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_CLIK
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ENBL_FL
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_NAME
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MKT_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_TS
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_USER_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_TS
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_USER_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_TS
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_USER_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DLTD_FL
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NA
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   PRD_CD
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_NA
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   SOUND_FILE_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   IS_DYNMC_BTTN
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Table name: BTTN_BKP
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Columns: BTTN_ID:2, DATA_INST_ID:2, SCR_ID:2, BTTN_NU:2, CAT:2, WDTH:2, HGHT:2, KEY_SCAN:2, KEY_SHFT:2, FRGND_CPTN_COLR:12, FRGND_CPTN_COLR_PRSD:12, BKGD_CPTN_COLR:12, BKGD_CPTN_COLR_PRSD:12, BLM_FL:2, LCLZ_FL:2, MENU_ITEM_NU:2, BTTN_ASGN_LVL_ID:2, ON_ATVT:2, ON_CLIK:2, ENBL_FL:2, BLM_SET_ID:2, BTTN_ASGN_LVL_NAME:12, MKT_ID:2, CRTE_TS:93, CRTE_USER_ID:12, UPDT_TS:93, UPDT_USER_ID:12, DEL_TS:93, DEL_USER_ID:12, DLTD_FL:2, MENU_ITEM_NA:12, PRD_CD:2, BLM_SET_NA:12, SOUND_FILE_ID:2, IS_DYNMC_BTTN:2, FRGND_CPTN_COLR_ID:2, FRGND_CPTN_COLR_PRSD_ID:2, BKGD_CPTN_COLR_ID:2, BKGD_CPTN_COLR_PRSD_ID:2,
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: sourceFilename is BTTN_BKP.java
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
> 13/03/05 19:20:16 INFO orm.CompilationManager: HADOOP_HOME is /home/hadoop/hadoop-1.0.3/libexec/..
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Invoking javac with args:
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -sourcepath
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -d
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -classpath
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   /home/hadoop/hadoop-1.0.3/libexec/../conf:/usr/java/jdk1.6.0_32/lib/tools.jar:/home/hadoop/hadoop-1.0.3/libexec/..:/home/hadoop/hadoop-1.0.3/libexec/../hadoop-core-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/asm-3.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjrt-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjtools-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-1.7.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-cli-1.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-codec-1.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-collections-3.2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-configuration-1.6.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-daemon-1.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-digester-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-el-1.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-httpclient-3.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-io-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-lang-2.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-1.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-api-1.0.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-math-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-net-1.4.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/core-3.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hsqldb-1.8.0.10.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-compiler-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-runtime-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jdeb-0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-core-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-json-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-server-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jets3t-0.6.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-util-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsch-0.1.42.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/junit-4.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/kfs-0.2.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/log4j-1.2.15.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/mockito-all-1.8.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/oro-2.0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/servlet-api-2.5-20081211.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-api-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/xmlenc-0.52.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/home/hadoop/sqoop/conf::/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/home/hadoop/sqoop/lib/avro-1.5.3.jar:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar:/home/hadoop/sqoop/lib/commons-io-1.4.jar:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar:/home/hadoop/sqoop/
lib/jackson-core-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar:/home/hadoop/sqoop/lib/ojdbc6.jar:/home/hadoop/sqoop/lib/paranamer-2.3.jar:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar:/home/hadoop/sqoop/sqoop-test-1.4.2.jar::/home/hadoop/hadoop-1.0.3/hadoop-core-1.0.3.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar
> Note: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 13/03/05 19:20:18 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.class -> BTTN_BKP.class
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar
> 13/03/05 19:20:18 INFO mapreduce.ExportJobBase: Beginning export of BTTN_BKP
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Using InputFormat: class org.apache.sqoop.mapreduce.ExportInputFormat
> 13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Got cached connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> 13/03/05 19:20:18 INFO manager.OracleManager: Time zone has been set to GMT
> 13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/paranamer-2.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-1.5.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/commons-io-1.4.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar
> 13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process : 4
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Total input bytes=184266237
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: maxSplitSize=184266237
> 13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process : 4
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Generated splits:
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat:   Paths:/home/hadoop/user/hive/warehouse/bttn/part-m-00000:0+20908340,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:0+67108864,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:67108864+24822805,/home/hadoop/user/hive/warehouse/bttn/part-m-00002:0+26675150,/home/hadoop/user/hive/warehouse/bttn/part-m-00003:0+44751078 Locations:NHCLT-PC44-2.hclt.corp.hcl.in:;
> 13/03/05 19:20:19 INFO mapred.JobClient: Running job: job_201303051835_0010
> 13/03/05 19:20:20 INFO mapred.JobClient:  map 0% reduce 0%
> 13/03/05 19:20:36 INFO mapred.JobClient:  map 7% reduce 0%
> 13/03/05 19:20:39 INFO mapred.JobClient:  map 11% reduce 0%
> 13/03/05 19:20:42 INFO mapred.JobClient:  map 16% reduce 0%
> 13/03/05 19:20:45 INFO mapred.JobClient:  map 17% reduce 0%
> 13/03/05 19:20:48 INFO mapred.JobClient:  map 20% reduce 0%
> 13/03/05 19:20:51 INFO mapred.JobClient:  map 27% reduce 0%
> 13/03/05 19:20:54 INFO mapred.JobClient:  map 32% reduce 0%
> 13/03/05 19:20:57 INFO mapred.JobClient:  map 33% reduce 0%
> 13/03/05 19:21:01 INFO mapred.JobClient:  map 38% reduce 0%
> 13/03/05 19:21:04 INFO mapred.JobClient:  map 39% reduce 0%
> 13/03/05 19:21:07 INFO mapred.JobClient:  map 43% reduce 0%
> 13/03/05 19:21:10 INFO mapred.JobClient:  map 44% reduce 0%
> 13/03/05 19:21:13 INFO mapred.JobClient:  map 48% reduce 0%
> 13/03/05 19:21:18 INFO mapred.JobClient: Task Id : attempt_201303051835_0010_m_000000_0, Status : FAILED
> java.util.NoSuchElementException
>         at java.util.AbstractList$Itr.next(AbstractList.java:350)
>         at BTTN_BKP.__loadFromFields(BTTN_BKP.java:1349)
>         at BTTN_BKP.parse(BTTN_BKP.java:1148)
>        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> 
> 13/03/05 19:21:19 INFO mapred.JobClient:  map 0% reduce 0%
> 13/03/05 19:21:27 INFO mapred.JobClient: Task Id : attempt_201303051835_0010_m_000000_1, Status : FAILED
> java.io.IOException: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated
> 
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
>         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:639)
>         at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated
> 
>         at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:10345)
>         at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:230)
>         at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:228)
> 
> 13/03/05 19:21:48 WARN mapred.JobClient: Error reading task outputConnection timed out
> 13/03/05 19:22:09 WARN mapred.JobClient: Error reading task outputConnection timed out
> 13/03/05 19:22:09 INFO mapred.JobClient: Job complete: job_201303051835_0010
> 13/03/05 19:22:09 INFO mapred.JobClient: Counters: 8
> 13/03/05 19:22:09 INFO mapred.JobClient:   Job Counters
> 13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=77152
> 13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Rack-local map tasks=3
> 13/03/05 19:22:09 INFO mapred.JobClient:     Launched map tasks=4
> 13/03/05 19:22:09 INFO mapred.JobClient:     Data-local map tasks=1
> 13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Failed map tasks=1
> 13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 110.4837 seconds (0 bytes/sec)
> 13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Exported 0 records.
> 13/03/05 19:22:09 ERROR tool.ExportTool: Error during export: Export job failed!
> [hadoop@NHCLT-PC44-2 sqoop-oper]$
> 
> Regards,
> Ajit Kumar Shreevastava
> 
> 
> 

Re: Error while exporting table data from hive to Oracle through Sqoop

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Ajit,
would you mind upgrading to Sqoop 1.4.3 RC 0 [1]? It has already been voted to be released as the final 1.4.3, so it should be safe to use.

One of the improvements in 1.4.3 is SQOOP-720 [2], which significantly improves the error message in this scenario.

Jarcec

Links:
1: http://people.apache.org/~hshreedharan/sqoop-1.4.3-rc0/
2: https://issues.apache.org/jira/browse/SQOOP-720
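A minimal sketch of dropping the RC in place, assuming a binary tarball is published in that directory (the exact file name below is hypothetical - check the listing at [1] for the actual artifact):

  # Fetch and unpack the RC; the tarball name is an assumption, verify it
  # against http://people.apache.org/~hshreedharan/sqoop-1.4.3-rc0/
  wget http://people.apache.org/~hshreedharan/sqoop-1.4.3-rc0/sqoop-1.4.3.bin__hadoop-1.0.0.tar.gz
  tar -xzf sqoop-1.4.3.bin__hadoop-1.0.0.tar.gz -C /home/hadoop
  # Point SQOOP_HOME at the new build and re-run the failing export
  export SQOOP_HOME=/home/hadoop/sqoop-1.4.3.bin__hadoop-1.0.0
  export PATH=$SQOOP_HOME/bin:$PATH
  sqoop version   # should report 1.4.3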

On Wed, Mar 06, 2013 at 05:44:44AM +0000, Ajit Kumar Shreevastava wrote:
> Hi Abhijeet,
> 
> Data is fine. The map task first ran to 48% and then failed. After that, the retried map task attempted to load the same data again, which is why the unique constraint error came up.
> 
> 
> 
> Regards,
> 
> Ajit Kumar Shreevastava
> 
> abhijeet gaikwad <ab...@gmail.com> wrote:
> 
> 
> + sqoop user
> 
> The answer is in your exception! Check your data; you're hitting a unique key violation.
> 
> Thanks,
> Abhijeet
> On Tue, Mar 5, 2013 at 7:24 PM, Ajit Kumar Shreevastava <Aj...@hcl.com> wrote:
> [snip: original message with full export log, quoted above]

Re: Error while exporting table data from hive to Oracle through Sqoop

Posted by abhijeet gaikwad <ab...@gmail.com>.
It seems you're violating the unique key constraint in the second task attempt,
which is expected if there is already some data committed by Sqoop in the first
attempt. This is an issue with Sqoop!

From the exception in the first attempt, it looks like there is some issue when
the auto-generated class (BTTN_BKP.java in your case) tries to parse the data.
Can you validate the data being inserted (not the unique constraint, but
some invalid data for some column in the input files)?
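As a quick sketch of such a check, you can count the delimited fields on every
line and flag the rows that do not match (assuming the Hive default \001
delimiter and the 39 columns your ClassWriter log lists; adjust both for your
table):

  # Print the line number and field count of every row that does not have
  # exactly 39 fields; such a row is a likely trigger for the
  # NoSuchElementException, since the generated __loadFromFields() runs out
  # of tokens when a line has fewer fields than columns.
  hadoop fs -cat /home/hadoop/user/hive/warehouse/bttn/part-m-* | \
    awk -F $'\001' 'NF != 39 { print NR ": " NF " fields" }' | head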

If the data is huge, here is a way of narrowing down your search for the
problematic row: run Sqoop with just one mapper (-m 1) and make sure you have
one file that contains the whole data. If there are multiple files, the only
ways I can think of right now are to run Sqoop once per file or to merge the
data into one file (see the merge sketch below). I am not sure this is enough
for you to control the number of mappers, but the motive here is to make sure
only one map task is created for the job.
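For the merge, something like this should work (a sketch; getmerge
materializes the data on local disk first, so make sure there is room for the
~184 MB shown in your log):

  # Concatenate all part files into one local file, then put it back on HDFS
  hadoop fs -getmerge /home/hadoop/user/hive/warehouse/bttn /tmp/bttn_merged
  hadoop fs -mkdir /home/hadoop/user/hive/warehouse/bttn_merged
  hadoop fs -put /tmp/bttn_merged /home/hadoop/user/hive/warehouse/bttn_merged/
  # then point --export-dir at /home/hadoop/user/hive/warehouse/bttn_merged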
After running Sqoop and getting that exception, look at the data inserted into
the table; from it you can figure out which batch Sqoop was trying to insert
next! Then try to find the erroneous row within that batch - by default a
transaction contains 10,000 rows (rows per insert statement (100) * insert
statements per transaction (100)) before a commit is fired, so check the next
10,000 rows. If you want to narrow this down further, set this on your command
line:

$ sqoop export -Dsqoop.export.records.per.statement=1 -Dsqoop.export.statements.per.transaction=1 --connect ...
This will make sure a commit is fired after the insertion of every row. I
haven't tried this with Oracle, but I was able to set the batch size using
these options for SQL Server/MySQL.
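Combined with your original command, that would look something like this (a
sketch; the -D properties go right after the tool name, before the other
options):

  sqoop export -Dsqoop.export.records.per.statement=1 \
    -Dsqoop.export.statements.per.transaction=1 \
    --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb \
    --username HDFSUSER --table BTTN_BKP \
    --export-dir /home/hadoop/user/hive/warehouse/bttn \
    -P --verbose -m 1 --input-fields-terminated-by '\001'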

Hopefully this helps you find the invalid data values, if any! If there is no
invalid data, I would suggest continuing this discussion on the Sqoop
user/dev mailing lists; you're still posting via the Hive user list.

Thanks,
Abhijeet

On Wed, Mar 6, 2013 at 11:14 AM, Ajit Kumar Shreevastava <Ajit.Shreevastava@hcl.com> wrote:

> Hi Abhijeet,
>
> Data is fine. The map task first ran to 48% and then failed. After that, the retried map task attempted to load the same data again, which is why the unique constraint error came up.
>
> Regards,
>
> Ajit Kumar Shreevastava
>
> abhijeet gaikwad <ab...@gmail.com> wrote:
>
> + sqoop user
>
> The answer is in your exception! Check your data; you're hitting a unique key
> violation.
>
> Thanks,
> Abhijeet
>
> On Tue, Mar 5, 2013 at 7:24 PM, Ajit Kumar Shreevastava <Ajit.Shreevastava@hcl.com> wrote:
>
> Hi All,****
>
>  ****
>
> I am facing following issue while exporting table from hive to Oracle.
> Importing table from Oracle to Hive and HDFS is working fine. Please let me
> know where I lag. I am pasting my screen output here.****
>
>  ****
>
>  ****
>
> *[hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect
> jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER  --table
> BTTN_BKP --export-dir  /home/hadoop/user/hive/warehouse/bttn  -P --verbose
> -m 1  --input-fields-terminated-by '\001'*****
>
> Warning: /usr/lib/hbase does not exist! HBase imports will fail.****
>
> Please set $HBASE_HOME to the root of your HBase installation.****
>
> 13/03/05 19:20:11 DEBUG tool.BaseSqoopTool: Enabled debug logging.****
>
> Enter password:****
>
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Loaded manager factory:
> com.cloudera.sqoop.manager.DefaultManagerFactory****
>
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> com.cloudera.sqoop.manager.DefaultManagerFactory****
>
> 13/03/05 19:20:16 DEBUG manager.DefaultManagerFactory: Trying with scheme:
> jdbc:oracle:thin:@10.99.42.11****
>
> 13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Instantiated new
> connection cache.****
>
> 13/03/05 19:20:16 INFO manager.SqlManager: Using default fetchSize of 1000
> ****
>
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> org.apache.sqoop.manager.OracleManager@2abe0e27****
>
> 13/03/05 19:20:16 INFO tool.CodeGenTool: Beginning code generation****
>
> 13/03/05 19:20:16 DEBUG manager.OracleManager: Using column names query:
> SELECT t.* FROM BTTN_BKP t WHERE 1=0****
>
> 13/03/05 19:20:16 DEBUG manager.OracleManager: Creating a new connection
> for jdbc:oracle:thin:@10.99.42.11:1521/clouddb, using username: HDFSUSER**
> **
>
> 13/03/05 19:20:16 DEBUG manager.OracleManager: No connection paramenters
> specified. Using regular API for making connection.****
>
> 13/03/05 19:20:16 INFO manager.OracleManager: Time zone has been set to GMT
> ****
>
> 13/03/05 19:20:16 DEBUG manager.SqlManager: Using fetchSize for next
> query: 1000****
>
> 13/03/05 19:20:16 INFO manager.SqlManager: Executing SQL statement: SELECT
> t.* FROM BTTN_BKP t WHERE 1=0****
>
> 13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Caching released
> connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: selected columns:****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DATA_INST_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   SCR_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_NU****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CAT****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   WDTH****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   HGHT****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SCAN****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SHFT****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_FL****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   LCLZ_FL****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NU****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_ATVT****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_CLIK****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ENBL_FL****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_NAME****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MKT_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_TS****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_USER_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_TS****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_USER_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_TS****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_USER_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DLTD_FL****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NA****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   PRD_CD****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_NA****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   SOUND_FILE_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   IS_DYNMC_BTTN****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Writing source file:
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java**
> **
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Table name: BTTN_BKP****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Columns: BTTN_ID:2,
> DATA_INST_ID:2, SCR_ID:2, BTTN_NU:2, CAT:2, WDTH:2, HGHT:2, KEY_SCAN:2,
> KEY_SHFT:2, FRGND_CPTN_COLR:12, FRGND_CPTN_COLR_PRSD:12, BKGD_CPTN_COLR:12,
> BKGD_CPTN_COLR_PRSD:12, BLM_FL:2, LCLZ_FL:2, MENU_ITEM_NU:2,
> BTTN_ASGN_LVL_ID:2, ON_ATVT:2, ON_CLIK:2, ENBL_FL:2, BLM_SET_ID:2,
> BTTN_ASGN_LVL_NAME:12, MKT_ID:2, CRTE_TS:93, CRTE_USER_ID:12, UPDT_TS:93,
> UPDT_USER_ID:12, DEL_TS:93, DEL_USER_ID:12, DLTD_FL:2, MENU_ITEM_NA:12,
> PRD_CD:2, BLM_SET_NA:12, SOUND_FILE_ID:2, IS_DYNMC_BTTN:2,
> FRGND_CPTN_COLR_ID:2, FRGND_CPTN_COLR_PRSD_ID:2, BKGD_CPTN_COLR_ID:2,
> BKGD_CPTN_COLR_PRSD_ID:2,****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: sourceFilename is BTTN_BKP.java**
> **
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Found existing
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/****
>
> 13/03/05 19:20:16 INFO orm.CompilationManager: HADOOP_HOME is
> /home/hadoop/hadoop-1.0.3/libexec/..****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Adding source file:
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java**
> **
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Invoking javac with args:*
> ***
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -sourcepath****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -d****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -classpath****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:
> /home/hadoop/hadoop-1.0.3/libexec/../conf:/usr/java/jdk1.6.0_32/lib/tools.jar:/home/hadoop/hadoop-1.0.3/libexec/..:/home/hadoop/hadoop-1.0.3/libexec/../hadoop-core-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/asm-3.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjrt-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjtools-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-1.7.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-cli-1.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-codec-1.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-collections-3.2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-configuration-1.6.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-daemon-1.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-digester-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-el-1.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-httpclient-3.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-io-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-lang-2.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-1.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-api-1.0.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-math-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-net-1.4.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/core-3.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hsqldb-1.8.0.10.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-compiler-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-runtime-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jdeb-0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-core-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-json-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-server-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jets3t-0.6.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-util-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsch-0.1.42.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/junit-4.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/kfs-0.2.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/log4j-1.2.15.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/mockito-all-1.8.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/oro-2.0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/servlet-api-2.5-20081211.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-api-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/xmlenc-0.52.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/home/hadoop/sqoop/conf::/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/home/hadoop/sqoop/lib/avro-1.5.3.jar:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar:/home/hadoop/sqoop/lib/commons-io-1.4.jar:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar:/home/hadoop/sqoop/
lib/jackson-mapper-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar:/home/hadoop/sqoop/lib/ojdbc6.jar:/home/hadoop/sqoop/lib/paranamer-2.3.jar:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar:/home/hadoop/sqoop/sqoop-test-1.4.2.jar::/home/hadoop/hadoop-1.0.3/hadoop-core-1.0.3.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar
> ****
>
> Note:
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java
> uses or overrides a deprecated API.****
>
> Note: Recompile with -Xlint:deprecation for details.****
>
> 13/03/05 19:20:18 INFO orm.CompilationManager: Writing jar file:
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar***
> *
>
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Scanning for .class files
> in directory: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e**
> **
>
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Got classfile:
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.class
> -> BTTN_BKP.class****
>
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Finished writing jar file
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar***
> *
>
> 13/03/05 19:20:18 INFO mapreduce.ExportJobBase: Beginning export of
> BTTN_BKP****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Using InputFormat: class
> org.apache.sqoop.mapreduce.ExportInputFormat****
>
> 13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Got cached
> connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER****
>
> 13/03/05 19:20:18 INFO manager.OracleManager: Time zone has been set to GMT
> ****
>
> 13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Caching released
> connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/sqoop-1.4.2.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/ojdbc6.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/sqoop-1.4.2.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/sqoop-1.4.2.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/ojdbc6.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/paranamer-2.3.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/avro-1.5.3.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/commons-io-1.4.jar****
>
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar****
>
> 13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process
> : 4****
>
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1*
> ***
>
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Total input
> bytes=184266237****
>
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: maxSplitSize=184266237
> ****
>
> 13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process
> : 4****
>
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Generated splits:****
>
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat:
> Paths:/home/hadoop/user/hive/warehouse/bttn/part-m-00000:0+20908340,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:0+67108864,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:67108864+24822805,/home/hadoop/user/hive/warehouse/bttn/part-m-00002:0+26675150,/home/hadoop/user/hive/warehouse/bttn/part-m-00003:0+44751078
> Locations:NHCLT-PC44-2.hclt.corp.hcl.in:;****
>
> 13/03/05 19:20:19 INFO mapred.JobClient: Running job: job_201303051835_0010
> ****
>
> 13/03/05 19:20:20 INFO mapred.JobClient:  map 0% reduce 0%****
>
> 13/03/05 19:20:36 INFO mapred.JobClient:  map 7% reduce 0%****
>
> 13/03/05 19:20:39 INFO mapred.JobClient:  map 11% reduce 0%****
>
> 13/03/05 19:20:42 INFO mapred.JobClient:  map 16% reduce 0%****
>
> 13/03/05 19:20:45 INFO mapred.JobClient:  map 17% reduce 0%****
>
> 13/03/05 19:20:48 INFO mapred.JobClient:  map 20% reduce 0%****
>
> 13/03/05 19:20:51 INFO mapred.JobClient:  map 27% reduce 0%****
>
> 13/03/05 19:20:54 INFO mapred.JobClient:  map 32% reduce 0%****
>
> 13/03/05 19:20:57 INFO mapred.JobClient:  map 33% reduce 0%****
>
> 13/03/05 19:21:01 INFO mapred.JobClient:  map 38% reduce 0%****
>
> 13/03/05 19:21:04 INFO mapred.JobClient:  map 39% reduce 0%****
>
> 13/03/05 19:21:07 INFO mapred.JobClient:  map 43% reduce 0%****
>
> 13/03/05 19:21:10 INFO mapred.JobClient:  map 44% reduce 0%****
>
> 13/03/05 19:21:13 INFO mapred.JobClient:  map 48% reduce 0%****
>
> 13/03/05 19:21:18 INFO mapred.JobClient: Task Id :
> attempt_201303051835_0010_m_000000_0, Status : FAILED****
>
> java.util.NoSuchElementException****
>
>         at java.util.AbstractList$Itr.next(AbstractList.java:350)****
>
>         at BTTN_BKP.__loadFromFields(BTTN_BKP.java:1349)****
>
>         at BTTN_BKP.parse(BTTN_BKP.java:1148)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
> 13/03/05 19:21:19 INFO mapred.JobClient:  map 0% reduce 0%
> 13/03/05 19:21:27 INFO mapred.JobClient: Task Id : attempt_201303051835_0010_m_000000_1, Status : FAILED
> java.io.IOException: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated
>
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
>         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:639)
>         at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated
>
>         at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:10345)
>         at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:230)
>         at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:228)
>
> 13/03/05 19:21:48 WARN mapred.JobClient: Error reading task outputConnection timed out
> 13/03/05 19:22:09 WARN mapred.JobClient: Error reading task outputConnection timed out
> 13/03/05 19:22:09 INFO mapred.JobClient: Job complete: job_201303051835_0010
> 13/03/05 19:22:09 INFO mapred.JobClient: Counters: 8
> 13/03/05 19:22:09 INFO mapred.JobClient:   Job Counters
> 13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=77152
> 13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Rack-local map tasks=3
> 13/03/05 19:22:09 INFO mapred.JobClient:     Launched map tasks=4
> 13/03/05 19:22:09 INFO mapred.JobClient:     Data-local map tasks=1
> 13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Failed map tasks=1
> 13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 110.4837 seconds (0 bytes/sec)
> 13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Exported 0 records.
> 13/03/05 19:22:09 ERROR tool.ExportTool: Error during export: Export job failed!
>
> [hadoop@NHCLT-PC44-2 sqoop-oper]$
>
> Regards,
> Ajit Kumar Shreevastava

RE: Error while exporting table data from hive to Oracle through Sqoop

Posted by Ajit Kumar Shreevastava <Aj...@hcl.com>.
Hi Abhijeet,

The data is fine. The map task first ran to 48% and then failed. The retried attempt then tried to load the same rows over again, and that is why the unique-constraint error came.
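
One way to keep a failed attempt from leaving half-loaded rows behind for the retry to collide with is Sqoop's staging-table support for exports. A minimal sketch, assuming a staging table BTTN_BKP_STG (a name invented here) has been created in Oracle with the same layout as BTTN_BKP, e.g. CREATE TABLE BTTN_BKP_STG AS SELECT * FROM BTTN_BKP WHERE 1=0, and that the rows left behind by the failed 48% attempt have been cleaned out of BTTN_BKP first (e.g. TRUNCATE TABLE BTTN_BKP):

# rerun of the export above, staged through an assumed table BTTN_BKP_STG
sqoop export \
  --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb \
  --username HDFSUSER -P \
  --table BTTN_BKP \
  --staging-table BTTN_BKP_STG \
  --clear-staging-table \
  --export-dir /home/hadoop/user/hive/warehouse/bttn \
  --input-fields-terminated-by '\001' \
  -m 1

With a staging table, rows are written to BTTN_BKP_STG and moved into BTTN_BKP in a single transaction only after all mappers succeed, so a failed run leaves the target table untouched.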



Regards,

Ajit Kumar Shreevastava

abhijeet gaikwad <ab...@gmail.com> wrote:


+ sqoop user

The answer is in your exception! Check your data: you're hitting a unique key violation.

Thanks,
Abhijeet
On Tue, Mar 5, 2013 at 7:24 PM, Ajit Kumar Shreevastava <Aj...@hcl.com> wrote:
Hi All,

I am facing following issue while exporting table from hive to Oracle. Importing table from Oracle to Hive and HDFS is working fine. Please let me know where I lag. I am pasting my screen output here.


[hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER  --table BTTN_BKP --export-dir  /home/hadoop/user/hive/warehouse/bttn  -P --verbose  -m 1  --input-fields-terminated-by '\001'
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
13/03/05 19:20:11 DEBUG tool.BaseSqoopTool: Enabled debug logging.
Enter password:
13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
13/03/05 19:20:16 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:oracle:thin:@10.99.42.11
13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Instantiated new connection cache.
13/03/05 19:20:16 INFO manager.SqlManager: Using default fetchSize of 1000
13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.OracleManager@2abe0e27
13/03/05 19:20:16 INFO tool.CodeGenTool: Beginning code generation
13/03/05 19:20:16 DEBUG manager.OracleManager: Using column names query: SELECT t.* FROM BTTN_BKP t WHERE 1=0
13/03/05 19:20:16 DEBUG manager.OracleManager: Creating a new connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb, using username: HDFSUSER
13/03/05 19:20:16 DEBUG manager.OracleManager: No connection paramenters specified. Using regular API for making connection.
13/03/05 19:20:16 INFO manager.OracleManager: Time zone has been set to GMT
13/03/05 19:20:16 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
13/03/05 19:20:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM BTTN_BKP t WHERE 1=0
13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
13/03/05 19:20:16 DEBUG orm.ClassWriter: selected columns:
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   DATA_INST_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   SCR_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_NU
13/03/05 19:20:16 DEBUG orm.ClassWriter:   CAT
13/03/05 19:20:16 DEBUG orm.ClassWriter:   WDTH
13/03/05 19:20:16 DEBUG orm.ClassWriter:   HGHT
13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SCAN
13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SHFT
13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR
13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_FL
13/03/05 19:20:16 DEBUG orm.ClassWriter:   LCLZ_FL
13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NU
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_ATVT
13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_CLIK
13/03/05 19:20:16 DEBUG orm.ClassWriter:   ENBL_FL
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_NAME
13/03/05 19:20:16 DEBUG orm.ClassWriter:   MKT_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_TS
13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_USER_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_TS
13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_USER_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_TS
13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_USER_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   DLTD_FL
13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NA
13/03/05 19:20:16 DEBUG orm.ClassWriter:   PRD_CD
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_NA
13/03/05 19:20:16 DEBUG orm.ClassWriter:   SOUND_FILE_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   IS_DYNMC_BTTN
13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD_ID
13/03/05 19:20:16 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java
13/03/05 19:20:16 DEBUG orm.ClassWriter: Table name: BTTN_BKP
13/03/05 19:20:16 DEBUG orm.ClassWriter: Columns: BTTN_ID:2, DATA_INST_ID:2, SCR_ID:2, BTTN_NU:2, CAT:2, WDTH:2, HGHT:2, KEY_SCAN:2, KEY_SHFT:2, FRGND_CPTN_COLR:12, FRGND_CPTN_COLR_PRSD:12, BKGD_CPTN_COLR:12, BKGD_CPTN_COLR_PRSD:12, BLM_FL:2, LCLZ_FL:2, MENU_ITEM_NU:2, BTTN_ASGN_LVL_ID:2, ON_ATVT:2, ON_CLIK:2, ENBL_FL:2, BLM_SET_ID:2, BTTN_ASGN_LVL_NAME:12, MKT_ID:2, CRTE_TS:93, CRTE_USER_ID:12, UPDT_TS:93, UPDT_USER_ID:12, DEL_TS:93, DEL_USER_ID:12, DLTD_FL:2, MENU_ITEM_NA:12, PRD_CD:2, BLM_SET_NA:12, SOUND_FILE_ID:2, IS_DYNMC_BTTN:2, FRGND_CPTN_COLR_ID:2, FRGND_CPTN_COLR_PRSD_ID:2, BKGD_CPTN_COLR_ID:2, BKGD_CPTN_COLR_PRSD_ID:2,
13/03/05 19:20:16 DEBUG orm.ClassWriter: sourceFilename is BTTN_BKP.java
13/03/05 19:20:16 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
13/03/05 19:20:16 INFO orm.CompilationManager: HADOOP_HOME is /home/hadoop/hadoop-1.0.3/libexec/..
13/03/05 19:20:16 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java
13/03/05 19:20:16 DEBUG orm.CompilationManager: Invoking javac with args:
13/03/05 19:20:16 DEBUG orm.CompilationManager:   -sourcepath
13/03/05 19:20:16 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
13/03/05 19:20:16 DEBUG orm.CompilationManager:   -d
13/03/05 19:20:16 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
13/03/05 19:20:16 DEBUG orm.CompilationManager:   -classpath
13/03/05 19:20:16 DEBUG orm.CompilationManager:   /home/hadoop/hadoop-1.0.3/libexec/../conf:/usr/java/jdk1.6.0_32/lib/tools.jar:/home/hadoop/hadoop-1.0.3/libexec/..:/home/hadoop/hadoop-1.0.3/libexec/../hadoop-core-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/asm-3.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjrt-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjtools-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-1.7.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-cli-1.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-codec-1.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-collections-3.2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-configuration-1.6.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-daemon-1.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-digester-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-el-1.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-httpclient-3.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-io-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-lang-2.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-1.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-api-1.0.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-math-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-net-1.4.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/core-3.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hsqldb-1.8.0.10.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-compiler-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-runtime-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jdeb-0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-core-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-json-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-server-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jets3t-0.6.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-util-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsch-0.1.42.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/junit-4.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/kfs-0.2.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/log4j-1.2.15.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/mockito-all-1.8.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/oro-2.0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/servlet-api-2.5-20081211.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-api-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/xmlenc-0.52.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/home/hadoop/sqoop/conf::/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/home/hadoop/sqoop/lib/avro-1.5.3.jar:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar:/home/hadoop/sqoop/lib/commons-io-1.4.jar:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar:/home/hadoop/sqoop/lib/ojdbc6.jar:/home/hadoop/sqoop/lib/paranamer-2.3.jar:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar:/home/hadoop/sqoop/sqoop-test-1.4.2.jar::/home/hadoop/hadoop-1.0.3/hadoop-core-1.0.3.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar
Note: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/03/05 19:20:18 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar
13/03/05 19:20:18 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e
13/03/05 19:20:18 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.class -> BTTN_BKP.class
13/03/05 19:20:18 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar
13/03/05 19:20:18 INFO mapreduce.ExportJobBase: Beginning export of BTTN_BKP
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Using InputFormat: class org.apache.sqoop.mapreduce.ExportInputFormat
13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Got cached connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
13/03/05 19:20:18 INFO manager.OracleManager: Time zone has been set to GMT
13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/paranamer-2.3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-1.5.3.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/commons-io-1.4.jar
13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar
13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process : 4
13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Total input bytes=184266237
13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: maxSplitSize=184266237
13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process : 4
13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Generated splits:
13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat:   Paths:/home/hadoop/user/hive/warehouse/bttn/part-m-00000:0+20908340,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:0+67108864,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:67108864+24822805,/home/hadoop/user/hive/warehouse/bttn/part-m-00002:0+26675150,/home/hadoop/user/hive/warehouse/bttn/part-m-00003:0+44751078 Locations:NHCLT-PC44-2.hclt.corp.hcl.in:;
13/03/05 19:20:19 INFO mapred.JobClient: Running job: job_201303051835_0010
13/03/05 19:20:20 INFO mapred.JobClient:  map 0% reduce 0%
13/03/05 19:20:36 INFO mapred.JobClient:  map 7% reduce 0%
13/03/05 19:20:39 INFO mapred.JobClient:  map 11% reduce 0%
13/03/05 19:20:42 INFO mapred.JobClient:  map 16% reduce 0%
13/03/05 19:20:45 INFO mapred.JobClient:  map 17% reduce 0%
13/03/05 19:20:48 INFO mapred.JobClient:  map 20% reduce 0%
13/03/05 19:20:51 INFO mapred.JobClient:  map 27% reduce 0%
13/03/05 19:20:54 INFO mapred.JobClient:  map 32% reduce 0%
13/03/05 19:20:57 INFO mapred.JobClient:  map 33% reduce 0%
13/03/05 19:21:01 INFO mapred.JobClient:  map 38% reduce 0%
13/03/05 19:21:04 INFO mapred.JobClient:  map 39% reduce 0%
13/03/05 19:21:07 INFO mapred.JobClient:  map 43% reduce 0%
13/03/05 19:21:10 INFO mapred.JobClient:  map 44% reduce 0%
13/03/05 19:21:13 INFO mapred.JobClient:  map 48% reduce 0%
13/03/05 19:21:18 INFO mapred.JobClient: Task Id : attempt_201303051835_0010_m_000000_0, Status : FAILED
java.util.NoSuchElementException
        at java.util.AbstractList$Itr.next(AbstractList.java:350)
        at BTTN_BKP.__loadFromFields(BTTN_BKP.java:1349)
        at BTTN_BKP.parse(BTTN_BKP.java:1148)
       at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

13/03/05 19:21:19 INFO mapred.JobClient:  map 0% reduce 0%
13/03/05 19:21:27 INFO mapred.JobClient: Task Id : attempt_201303051835_0010_m_000000_1, Status : FAILED
java.io.IOException: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated

        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:639)
        at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated

        at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:10345)
        at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:230)
        at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:228)

13/03/05 19:21:48 WARN mapred.JobClient: Error reading task outputConnection timed out
13/03/05 19:22:09 WARN mapred.JobClient: Error reading task outputConnection timed out
13/03/05 19:22:09 INFO mapred.JobClient: Job complete: job_201303051835_0010
13/03/05 19:22:09 INFO mapred.JobClient: Counters: 8
13/03/05 19:22:09 INFO mapred.JobClient:   Job Counters
13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=77152
13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/03/05 19:22:09 INFO mapred.JobClient:     Rack-local map tasks=3
13/03/05 19:22:09 INFO mapred.JobClient:     Launched map tasks=4
13/03/05 19:22:09 INFO mapred.JobClient:     Data-local map tasks=1
13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/03/05 19:22:09 INFO mapred.JobClient:     Failed map tasks=1
13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 110.4837 seconds (0 bytes/sec)
13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Exported 0 records.
13/03/05 19:22:09 ERROR tool.ExportTool: Error during export: Export job failed!
[hadoop@NHCLT-PC44-2 sqoop-oper]$

Regards,
Ajit Kumar Shreevastava


::DISCLAIMER::
----------------------------------------------------------------------------------------------------------------------------------------------------
The contents of this e-mail and any attachment(s) are confidential and intended for the named recipient(s) only.
E-mail transmission is not guaranteed to be secure or error-free as information could be intercepted, corrupted,
lost, destroyed, arrive late or incomplete, or may contain viruses in transmission. The e mail and its contents
(with or without referred errors) shall therefore not attach any liability on the originator or HCL or its affiliates.
Views or opinions, if any, presented in this email are solely those of the author and may not necessarily reflect the
views or opinions of HCL or its affiliates. Any form of reproduction, dissemination, copying, disclosure, modification,
distribution and / or publication of this message without the prior written consent of authorized representative of
HCL is strictly prohibited. If you have received this email in error please delete it and notify the sender immediately.
Before opening any email and/or attachments, please check them for viruses and other defects.
----------------------------------------------------------------------------------------------------------------------------------------------------


Re: Error while exporting table data from hive to Oracle through Sqoop

Posted by abhijeet gaikwad <ab...@gmail.com>.
+ sqoop user

The answer is in your exception! Check your data: you're hitting a unique key
violation.

Thanks,
Abhijeet
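
A quick way to check from the Hive side is to count repeats of the key columns in the table behind the export directory. A minimal sketch, assuming the Hive table is bttn and guessing that BTTN_ID and DATA_INST_ID are the columns behind BTTN_BKP_PK (verify the constraint's real columns in Oracle before trusting this):

# list up to 20 key values that occur more than once in the Hive table
hive -e "SELECT bttn_id, data_inst_id, COUNT(*) AS cnt
         FROM bttn
         GROUP BY bttn_id, data_inst_id
         HAVING COUNT(*) > 1
         LIMIT 20;"

Any rows returned are key values that will trip ORA-00001 on insert into a table that enforces the constraint.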

On Tue, Mar 5, 2013 at 7:24 PM, Ajit Kumar Shreevastava <Ajit.Shreevastava@hcl.com> wrote:

> [...]

Re: Error while exporting table data from hive to Oracle through Sqoop

Posted by Dean Wampler <de...@thinkbiganalytics.com>.
From the exceptions near the bottom, it looks like you're inserting data that doesn't have unique keys, so it could be a data problem.
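
If it is not obvious which columns the violated constraint covers, Oracle's data dictionary will say. A minimal sketch, assuming SQL*Plus can reach the same service named in the JDBC URL (EZConnect syntax; PASSWORD is a placeholder for the HDFSUSER password):

# list the columns behind the violated primary key, in key order
sqlplus -s HDFSUSER/PASSWORD@//10.99.42.11:1521/clouddb <<'EOF'
SELECT column_name, position
  FROM all_cons_columns
 WHERE owner = 'HDFSUSER'
   AND constraint_name = 'BTTN_BKP_PK'
 ORDER BY position;
EXIT;
EOF

Those are the columns to de-duplicate on (or the ones a retried mapper re-inserts) before rerunning the export.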

On Tue, Mar 5, 2013 at 7:54 AM, Ajit Kumar Shreevastava <Ajit.Shreevastava@hcl.com> wrote:

>  Hi All,****
>
> ** **
>
> I am facing following issue while exporting table from hive to Oracle.
> Importing table from Oracle to Hive and HDFS is working fine. Please let me
> know where I lag. I am pasting my screen output here.****
>
> ** **
>
> ** **
>
> *[hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect
> jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER  --table
> BTTN_BKP --export-dir  /home/hadoop/user/hive/warehouse/bttn  -P --verbose
> -m 1  --input-fields-terminated-by '\001'*
>
> Warning: /usr/lib/hbase does not exist! HBase imports will fail.****
>
> Please set $HBASE_HOME to the root of your HBase installation.****
>
> 13/03/05 19:20:11 DEBUG tool.BaseSqoopTool: Enabled debug logging.****
>
> Enter password:****
>
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Loaded manager factory:
> com.cloudera.sqoop.manager.DefaultManagerFactory****
>
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> com.cloudera.sqoop.manager.DefaultManagerFactory****
>
> 13/03/05 19:20:16 DEBUG manager.DefaultManagerFactory: Trying with scheme:
> jdbc:oracle:thin:@10.99.42.11****
>
> 13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Instantiated new
> connection cache.****
>
> 13/03/05 19:20:16 INFO manager.SqlManager: Using default fetchSize of 1000
> ****
>
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> org.apache.sqoop.manager.OracleManager@2abe0e27****
>
> 13/03/05 19:20:16 INFO tool.CodeGenTool: Beginning code generation****
>
> 13/03/05 19:20:16 DEBUG manager.OracleManager: Using column names query:
> SELECT t.* FROM BTTN_BKP t WHERE 1=0****
>
> 13/03/05 19:20:16 DEBUG manager.OracleManager: Creating a new connection
> for jdbc:oracle:thin:@10.99.42.11:1521/clouddb, using username: HDFSUSER**
> **
>
> 13/03/05 19:20:16 DEBUG manager.OracleManager: No connection paramenters
> specified. Using regular API for making connection.****
>
> 13/03/05 19:20:16 INFO manager.OracleManager: Time zone has been set to GMT
> ****
>
> 13/03/05 19:20:16 DEBUG manager.SqlManager: Using fetchSize for next
> query: 1000****
>
> 13/03/05 19:20:16 INFO manager.SqlManager: Executing SQL statement: SELECT
> t.* FROM BTTN_BKP t WHERE 1=0****
>
> 13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Caching released
> connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: selected columns:****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DATA_INST_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   SCR_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_NU****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CAT****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   WDTH****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   HGHT****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SCAN****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SHFT****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_FL****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   LCLZ_FL****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NU****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_ATVT****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_CLIK****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ENBL_FL****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_NAME****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MKT_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_TS****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_USER_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_TS****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_USER_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_TS****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_USER_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DLTD_FL****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NA****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   PRD_CD****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_NA****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   SOUND_FILE_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   IS_DYNMC_BTTN****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD_ID****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Writing source file:
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java**
> **
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Table name: BTTN_BKP****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Columns: BTTN_ID:2,
> DATA_INST_ID:2, SCR_ID:2, BTTN_NU:2, CAT:2, WDTH:2, HGHT:2, KEY_SCAN:2,
> KEY_SHFT:2, FRGND_CPTN_COLR:12, FRGND_CPTN_COLR_PRSD:12, BKGD_CPTN_COLR:12,
> BKGD_CPTN_COLR_PRSD:12, BLM_FL:2, LCLZ_FL:2, MENU_ITEM_NU:2,
> BTTN_ASGN_LVL_ID:2, ON_ATVT:2, ON_CLIK:2, ENBL_FL:2, BLM_SET_ID:2,
> BTTN_ASGN_LVL_NAME:12, MKT_ID:2, CRTE_TS:93, CRTE_USER_ID:12, UPDT_TS:93,
> UPDT_USER_ID:12, DEL_TS:93, DEL_USER_ID:12, DLTD_FL:2, MENU_ITEM_NA:12,
> PRD_CD:2, BLM_SET_NA:12, SOUND_FILE_ID:2, IS_DYNMC_BTTN:2,
> FRGND_CPTN_COLR_ID:2, FRGND_CPTN_COLR_PRSD_ID:2, BKGD_CPTN_COLR_ID:2,
> BKGD_CPTN_COLR_PRSD_ID:2,****
>
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: sourceFilename is BTTN_BKP.java**
> **
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Found existing
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/****
>
> 13/03/05 19:20:16 INFO orm.CompilationManager: HADOOP_HOME is
> /home/hadoop/hadoop-1.0.3/libexec/..****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Adding source file:
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java**
> **
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Invoking javac with args:*
> ***
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -sourcepath****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -d****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -classpath****
>
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:
> /home/hadoop/hadoop-1.0.3/libexec/../conf:/usr/java/jdk1.6.0_32/lib/tools.jar:/home/hadoop/hadoop-1.0.3/libexec/..:/home/hadoop/hadoop-1.0.3/libexec/../hadoop-core-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/asm-3.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjrt-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjtools-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-1.7.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-cli-1.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-codec-1.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-collections-3.2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-configuration-1.6.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-daemon-1.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-digester-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-el-1.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-httpclient-3.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-io-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-lang-2.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-1.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-api-1.0.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-math-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-net-1.4.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/core-3.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hsqldb-1.8.0.10.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-compiler-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-runtime-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jdeb-0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-core-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-json-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-server-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jets3t-0.6.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-util-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsch-0.1.42.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/junit-4.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/kfs-0.2.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/log4j-1.2.15.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/mockito-all-1.8.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/oro-2.0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/servlet-api-2.5-20081211.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-api-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/xmlenc-0.52.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/home/hadoop/sqoop/conf::/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/home/hadoop/sqoop/lib/avro-1.5.3.jar:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar:/home/hadoop/sqoop/lib/commons-io-1.4.jar:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar:/home/hadoop/sqoop/
> lib/jackson-mapper-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar:/home/hadoop/sqoop/lib/ojdbc6.jar:/home/hadoop/sqoop/lib/paranamer-2.3.jar:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar:/home/hadoop/sqoop/sqoop-test-1.4.2.jar::/home/hadoop/hadoop-1.0.3/hadoop-core-1.0.3.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar
> Note: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 13/03/05 19:20:18 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.class -> BTTN_BKP.class
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar
> 13/03/05 19:20:18 INFO mapreduce.ExportJobBase: Beginning export of BTTN_BKP
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Using InputFormat: class org.apache.sqoop.mapreduce.ExportInputFormat
> 13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Got cached connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> 13/03/05 19:20:18 INFO manager.OracleManager: Time zone has been set to GMT
> 13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/paranamer-2.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-1.5.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/commons-io-1.4.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar
> 13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process : 4
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Total input bytes=184266237
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: maxSplitSize=184266237
> 13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process : 4
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Generated splits:
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Paths:/home/hadoop/user/hive/warehouse/bttn/part-m-00000:0+20908340,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:0+67108864,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:67108864+24822805,/home/hadoop/user/hive/warehouse/bttn/part-m-00002:0+26675150,/home/hadoop/user/hive/warehouse/bttn/part-m-00003:0+44751078 Locations:NHCLT-PC44-2.hclt.corp.hcl.in:;
> 13/03/05 19:20:19 INFO mapred.JobClient: Running job: job_201303051835_0010
> 13/03/05 19:20:20 INFO mapred.JobClient:  map 0% reduce 0%
> 13/03/05 19:20:36 INFO mapred.JobClient:  map 7% reduce 0%
> 13/03/05 19:20:39 INFO mapred.JobClient:  map 11% reduce 0%
> 13/03/05 19:20:42 INFO mapred.JobClient:  map 16% reduce 0%
> 13/03/05 19:20:45 INFO mapred.JobClient:  map 17% reduce 0%
> 13/03/05 19:20:48 INFO mapred.JobClient:  map 20% reduce 0%
> 13/03/05 19:20:51 INFO mapred.JobClient:  map 27% reduce 0%
> 13/03/05 19:20:54 INFO mapred.JobClient:  map 32% reduce 0%
> 13/03/05 19:20:57 INFO mapred.JobClient:  map 33% reduce 0%
> 13/03/05 19:21:01 INFO mapred.JobClient:  map 38% reduce 0%
> 13/03/05 19:21:04 INFO mapred.JobClient:  map 39% reduce 0%
> 13/03/05 19:21:07 INFO mapred.JobClient:  map 43% reduce 0%
> 13/03/05 19:21:10 INFO mapred.JobClient:  map 44% reduce 0%
> 13/03/05 19:21:13 INFO mapred.JobClient:  map 48% reduce 0%
> 13/03/05 19:21:18 INFO mapred.JobClient: Task Id : attempt_201303051835_0010_m_000000_0, Status : FAILED
> java.util.NoSuchElementException
>         at java.util.AbstractList$Itr.next(AbstractList.java:350)
>         at BTTN_BKP.__loadFromFields(BTTN_BKP.java:1349)
>         at BTTN_BKP.parse(BTTN_BKP.java:1148)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> 13/03/05 19:21:19 INFO mapred.JobClient:  map 0% reduce 0%
> 13/03/05 19:21:27 INFO mapred.JobClient: Task Id : attempt_201303051835_0010_m_000000_1, Status : FAILED
> java.io.IOException: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
>         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:639)
>         at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated
>         at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:10345)
>         at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:230)
>         at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:228)
> 13/03/05 19:21:48 WARN mapred.JobClient: Error reading task outputConnection timed out
> 13/03/05 19:22:09 WARN mapred.JobClient: Error reading task outputConnection timed out
> 13/03/05 19:22:09 INFO mapred.JobClient: Job complete: job_201303051835_0010
> 13/03/05 19:22:09 INFO mapred.JobClient: Counters: 8
> 13/03/05 19:22:09 INFO mapred.JobClient:   Job Counters
> 13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=77152
> 13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Rack-local map tasks=3
> 13/03/05 19:22:09 INFO mapred.JobClient:     Launched map tasks=4
> 13/03/05 19:22:09 INFO mapred.JobClient:     Data-local map tasks=1
> 13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Failed map tasks=1
> 13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 110.4837 seconds (0 bytes/sec)
> 13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Exported 0 records.
> 13/03/05 19:22:09 ERROR tool.ExportTool: Error during export: Export job failed!
> [hadoop@NHCLT-PC44-2 sqoop-oper]$
>
> Regards,
> Ajit Kumar Shreevastava



-- 
Dean Wampler, Ph.D.
thinkbiganalytics.com
+1-312-339-1330

Re: Error while exporting table data from hive to Oracle through Sqoop

Posted by abhijeet gaikwad <ab...@gmail.com>.
+ sqoop user

The answer is in your exception! Check your data: you're hitting a
unique key violation. Note that the first task attempt died with a
java.util.NoSuchElementException while parsing a record, so it most
likely committed part of its batch before failing; the retry then
collided with those rows (see the sketches below).
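
For example, you can first check whether the failed first attempt left
partial data behind in the target (a sketch only; sqlplus will prompt
for the HDFSUSER password, and the TRUNCATE is left commented out so
nothing is dropped by accident):

# Did the failed first attempt already commit rows into BTTN_BKP?
sqlplus -s HDFSUSER@//10.99.42.11:1521/clouddb <<'SQL'
SELECT COUNT(*) AS rows_already_exported FROM bttn_bkp;
-- If these rows are only the half-finished output of the failed run:
-- TRUNCATE TABLE bttn_bkp;
SQL

If the target is clean and the export still dies, look at the first
stack trace: java.util.NoSuchElementException out of
BTTN_BKP.__loadFromFields means a record ran out of fields while being
parsed. With Hive-written files the usual culprit is NULL values, which
Hive stores as the literal \N. A re-run along these lines tells Sqoop
how to read those NULLs and stages the rows so that a failed task cannot
leave partial data in the real table (again a sketch: STG_BTTN_BKP is a
hypothetical empty table with the same schema as BTTN_BKP that you would
create beforehand):

sqoop export \
  --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb \
  --username HDFSUSER -P \
  --table BTTN_BKP \
  --staging-table STG_BTTN_BKP --clear-staging-table \
  --export-dir /home/hadoop/user/hive/warehouse/bttn \
  --input-fields-terminated-by '\001' \
  --input-null-string '\\N' --input-null-non-string '\\N' \
  -m 1 --verbose

With a staging table, rows are moved into BTTN_BKP in a single
transaction only after the map tasks succeed, so a retried task can no
longer trip BTTN_BKP_PK over its own half-finished output.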

Thanks,
Abhijeet

On Tue, Mar 5, 2013 at 7:24 PM, Ajit Kumar Shreevastava <Ajit.Shreevastava@hcl.com> wrote:

> [quoted original message snipped]