Posted to dev@sqoop.apache.org by "Jarek Jarcec Cecho (JIRA)" <ji...@apache.org> on 2012/08/20 15:32:38 UTC

[jira] [Resolved] (SQOOP-583) Zero exit code on Exception in sqoop import

     [ https://issues.apache.org/jira/browse/SQOOP-583?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jarek Jarcec Cecho resolved SQOOP-583.
--------------------------------------

    Resolution: Not A Problem
      Assignee: Jarek Jarcec Cecho

Hi Ruslan,
I wasn't able to reproduce your issue with sqoop 1.4.1 - more precisely:

{code}

root@ubuntu-cdh4:~# sqoop import --connect jdbc:mysql://172.16.252.1/sqoop --username XXXXX --password XXXXX --table pokus -m 1
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
12/08/20 06:27:58 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
12/08/20 06:27:59 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/08/20 06:27:59 INFO tool.CodeGenTool: Beginning code generation
12/08/20 06:27:59 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `pokus` AS t LIMIT 1
12/08/20 06:27:59 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `pokus` AS t LIMIT 1
12/08/20 06:27:59 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop
Note: /tmp/sqoop-root/compile/e4022ea411fadd7976b3f1de65291dd2/pokus.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/08/20 06:28:00 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-root/compile/e4022ea411fadd7976b3f1de65291dd2/pokus.java to /root/./pokus.java
org.apache.commons.io.FileExistsException: Destination '/root/./pokus.java' already exists
        at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
        at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:390)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
12/08/20 06:28:00 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/e4022ea411fadd7976b3f1de65291dd2/pokus.jar
12/08/20 06:28:00 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/08/20 06:28:00 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/08/20 06:28:00 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/08/20 06:28:00 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/08/20 06:28:00 INFO mapreduce.ImportJobBase: Beginning import of pokus
12/08/20 06:28:02 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
12/08/20 06:28:02 INFO mapred.JobClient: Cleaning up the staging area hdfs://ubuntu-cdh4/tmp/hadoop-mapred/mapred/staging/root/.staging/job_201208192257_0003
12/08/20 06:28:02 ERROR security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory pokus already exists
12/08/20 06:28:02 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory pokus already exists
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:132)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:883)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:844)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:844)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:481)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:511)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:141)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:201)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:464)
        at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:100)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)

root@ubuntu-cdh4:~# echo $?
1
{code}

Could you please upgrade your sqoop version to the most recent one?
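For reference, since the whole point of the issue is that schedulers rely on the exit status, a wrapper can gate downstream steps on it. This is a minimal sketch; `run_import` is a hypothetical stand-in (using `false` to simulate a failing sqoop invocation), not part of Sqoop itself:

```shell
# Hypothetical wrapper: run the import and act on its exit status.
# "false" stands in for a failing "sqoop import ..." command so the
# sketch is runnable anywhere; substitute the real invocation.
run_import() {
  false   # stand-in for: sqoop import --connect ... --table ... -m 1
}

if run_import; then
  echo "import succeeded"
else
  status=$?
  # A scheduler would see this nonzero status and flag the job.
  echo "import failed with exit code $status" >&2
fi
```

With sqoop 1.4.1 the failed import above returns 1, so the `else` branch fires and the failure propagates to the calling scheduler.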

Jarcec
                
> Zero exit code on Exception in sqoop import
> -------------------------------------------
>
>                 Key: SQOOP-583
>                 URL: https://issues.apache.org/jira/browse/SQOOP-583
>             Project: Sqoop
>          Issue Type: Bug
>    Affects Versions: 1.3.0
>            Reporter: Ruslan Al-Fakikh
>            Assignee: Jarek Jarcec Cecho
>
> I am getting zero exit code when there is a real exception when
> running Sqoop Import. The correct exit code (whether it is error or
> not) is important for our scheduling system to notify us of any
> errors. Should I file a jira issue for this bug?
> Here is what I get:
> For a regular sqoop command:
> {code}
> [cloudera@localhost workhive]$ sqoop
> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
> Please set $HBASE_HOME to the root of your HBase installation.
> Try 'sqoop help' for usage.
> [cloudera@localhost workhive]$ echo $?
> 1
> {code}
> So, the exit code is correct here.
> But for the import:
> {code}
> [cloudera@localhost workhive]$ sqoop import --username username --password password --hive-import --table ExternalPublisher --connect jdbc:sqlserver://url:port;databaseName=DBName;
> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
> Please set $HBASE_HOME to the root of your HBase installation.
> 12/08/17 20:52:39 WARN tool.BaseSqoopTool: Setting your password on
> the command-line is insecure. Consider using -P instead.
> 12/08/17 20:52:39 INFO tool.BaseSqoopTool: Using Hive-specific
> delimiters for output. You can override
> 12/08/17 20:52:39 INFO tool.BaseSqoopTool: delimiters with
> --fields-terminated-by, etc.
> 12/08/17 20:52:39 INFO SqlServer.MSSQLServerManagerFactory: Using
> Microsoft's SQL Server - Hadoop Connector
> 12/08/17 20:52:39 INFO manager.SqlManager: Using default fetchSize of 1000
> 12/08/17 20:52:39 INFO tool.CodeGenTool: Beginning code generation
> 12/08/17 20:52:42 INFO manager.SqlManager: Executing SQL statement:
> SELECT TOP 1 * FROM [ExternalPublisher]
> 12/08/17 20:52:42 INFO manager.SqlManager: Executing SQL statement:
> SELECT TOP 1 * FROM [ExternalPublisher]
> 12/08/17 20:52:43 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop
> 12/08/17 20:52:43 INFO orm.CompilationManager: Found hadoop core jar
> at: /usr/lib/hadoop/hadoop-0.20.2-cdh3u4-core.jar
> 12/08/17 20:52:45 ERROR orm.CompilationManager: Could not rename
> /tmp/sqoop-cloudera/compile/2c4caabe09a86fbb2055893836660076/ExternalPublisher.java
> to /home/cloudera/workhive/./ExternalPublisher.java
> java.io.IOException: Destination
> '/home/cloudera/workhive/./ExternalPublisher.java' already exists
>         at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:1811)
>         at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
>         at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
>         at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:370)
>         at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:456)
>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> 12/08/17 20:52:45 INFO orm.CompilationManager: Writing jar file:
> /tmp/sqoop-cloudera/compile/2c4caabe09a86fbb2055893836660076/ExternalPublisher.jar
> 12/08/17 20:52:45 INFO mapreduce.ImportJobBase: Beginning import of
> ExternalPublisher
> 12/08/17 20:52:46 INFO manager.SqlManager: Executing SQL statement:
> SELECT TOP 1 * FROM [ExternalPublisher]
> 12/08/17 20:52:48 INFO mapred.JobClient: Cleaning up the staging area
> hdfs://localhost/var/lib/hadoop-0.20/cache/mapred/mapred/staging/cloudera/.staging/job_201208072011_0004
> 12/08/17 20:52:48 ERROR security.UserGroupInformation:
> PriviledgedActionException as:cloudera (auth:SIMPLE)
> cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output
> directory ExternalPublisher already exists
> 12/08/17 20:52:48 ERROR tool.ImportTool: Encountered IOException
> running import job:
> org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory
> ExternalPublisher already exists
>         at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:132)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:872)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1177)
>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:476)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:506)
>         at com.cloudera.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:143)
>         at com.cloudera.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:203)
>         at com.cloudera.sqoop.manager.SqlManager.importTable(SqlManager.java:464)
>         at com.microsoft.sqoop.SqlServer.MSSQLServerManager.importTable(MSSQLServerManager.java:145)
>         at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:383)
>         at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:456)
>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> [cloudera@localhost workhive]$ echo $?
> 0
> {code}
> The error code shows success here, which is undesirable. I am not
> interested in why I get the FileAlreadyExistsException; I know how to
> handle it. The correct exit code is more important for maintenance.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira