Posted to user@sqoop.apache.org by Ruslan Al-Fakikh <me...@gmail.com> on 2012/08/20 11:02:33 UTC
Zero exit code on Exception in sqoop import
Hello all,
I am getting a zero exit code even when a real exception occurs while
running Sqoop import. A correct exit code (whether error or success) is
important for our scheduling system to notify us of any errors. Should
I file a JIRA issue for this bug?
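To show why this matters, here is a generic sketch of how a scheduler-style wrapper decides success vs. failure: it only sees the exit code, never the log output. `failing_job` is a stand-in name, not our actual job.

```shell
#!/bin/sh
# A scheduler only sees the exit code, not the log output.
# "failing_job" is a stand-in for the real sqoop invocation.
failing_job() {
    echo "ERROR tool.ImportTool: import failed"  # error appears only in the log
    return 0                                     # ...but the exit code says OK
}

if failing_job; then
    echo "scheduler: job succeeded"   # this branch is taken despite the error
else
    echo "scheduler: job failed, alerting"
fi
```

With a zero exit code, the failed import is silently treated as success and no alert is raised.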
Here is what I get:
For a regular sqoop command:
[cloudera@localhost workhive]$ sqoop
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Try 'sqoop help' for usage.
[cloudera@localhost workhive]$ echo $?
1
So the exit code is correct here.
But for the import:
[cloudera@localhost workhive]$ sqoop import --username
username --password password --hive-import --table ExternalPublisher
--connect jdbc:sqlserver://url:port;databaseName=DBName;
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
12/08/17 20:52:39 WARN tool.BaseSqoopTool: Setting your password on
the command-line is insecure. Consider using -P instead.
12/08/17 20:52:39 INFO tool.BaseSqoopTool: Using Hive-specific
delimiters for output. You can override
12/08/17 20:52:39 INFO tool.BaseSqoopTool: delimiters with
--fields-terminated-by, etc.
12/08/17 20:52:39 INFO SqlServer.MSSQLServerManagerFactory: Using
Microsoft's SQL Server - Hadoop Connector
12/08/17 20:52:39 INFO manager.SqlManager: Using default fetchSize of 1000
12/08/17 20:52:39 INFO tool.CodeGenTool: Beginning code generation
12/08/17 20:52:42 INFO manager.SqlManager: Executing SQL statement:
SELECT TOP 1 * FROM [ExternalPublisher]
12/08/17 20:52:42 INFO manager.SqlManager: Executing SQL statement:
SELECT TOP 1 * FROM [ExternalPublisher]
12/08/17 20:52:43 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop
12/08/17 20:52:43 INFO orm.CompilationManager: Found hadoop core jar
at: /usr/lib/hadoop/hadoop-0.20.2-cdh3u4-core.jar
12/08/17 20:52:45 ERROR orm.CompilationManager: Could not rename
/tmp/sqoop-cloudera/compile/2c4caabe09a86fbb2055893836660076/ExternalPublisher.java
to /home/cloudera/workhive/./ExternalPublisher.java
java.io.IOException: Destination
'/home/cloudera/workhive/./ExternalPublisher.java' already exists
at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:1811)
at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:370)
at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:456)
at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
12/08/17 20:52:45 INFO orm.CompilationManager: Writing jar file:
/tmp/sqoop-cloudera/compile/2c4caabe09a86fbb2055893836660076/ExternalPublisher.jar
12/08/17 20:52:45 INFO mapreduce.ImportJobBase: Beginning import of
ExternalPublisher
12/08/17 20:52:46 INFO manager.SqlManager: Executing SQL statement:
SELECT TOP 1 * FROM [ExternalPublisher]
12/08/17 20:52:48 INFO mapred.JobClient: Cleaning up the staging area
hdfs://localhost/var/lib/hadoop-0.20/cache/mapred/mapred/staging/cloudera/.staging/job_201208072011_0004
12/08/17 20:52:48 ERROR security.UserGroupInformation:
PriviledgedActionException as:cloudera (auth:SIMPLE)
cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output
directory ExternalPublisher already exists
12/08/17 20:52:48 ERROR tool.ImportTool: Encountered IOException
running import job:
org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory
ExternalPublisher already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:132)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:872)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1177)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:476)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:506)
at com.cloudera.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:143)
at com.cloudera.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:203)
at com.cloudera.sqoop.manager.SqlManager.importTable(SqlManager.java:464)
at com.microsoft.sqoop.SqlServer.MSSQLServerManager.importTable(MSSQLServerManager.java:145)
at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:383)
at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:456)
at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
[cloudera@localhost workhive]$ echo $?
0
The exit code reports success here, which is undesirable. I am not
interested in why I get the FileAlreadyExistsException; I know how to
handle that. The correct exit code is more important for maintenance.
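In the meantime, a wrapper along these lines could force a non-zero exit code by scanning the output for ERROR log lines. This is a workaround sketch, not part of Sqoop, and `run_checked` is a hypothetical helper name.

```shell
#!/bin/sh
# Workaround sketch (not part of Sqoop): run a command, echo its output,
# and force a non-zero exit code when the log contains an ERROR line.
# "run_checked" is a hypothetical helper name.
run_checked() {
    out=$("$@" 2>&1)
    status=$?
    printf '%s\n' "$out"
    # Fail if the wrapped command failed OR its log contains " ERROR "
    if [ "$status" -ne 0 ] || printf '%s\n' "$out" | grep -q ' ERROR '; then
        return 1
    fi
    return 0
}

# In practice the wrapped command would be the sqoop import itself:
#   run_checked sqoop import --connect ... --table ExternalPublisher ...
run_checked echo "12/08/17 20:52:48 ERROR tool.ImportTool: import failed"
echo "wrapper exit code: $?"
```

The scheduler would then invoke the wrapper instead of sqoop directly, so any ERROR line in the log maps to a failing exit code.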
Thanks in advance.
Re: Zero exit code on Exception in sqoop import
Posted by Ruslan Al-Fakikh <me...@gmail.com>.
Thanks. Jira issue created:
https://issues.apache.org/jira/browse/SQOOP-583
On Mon, Aug 20, 2012 at 1:10 PM, Jarek Jarcec Cecho <ja...@apache.org> wrote:
> Hi sir,
> I would recommend creating a JIRA on http://issues.apache.org/jira/browse/SQOOP
>
> Jarcec
Re: Zero exit code on Exception in sqoop import
Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi sir,
I would recommend creating a JIRA on http://issues.apache.org/jira/browse/SQOOP
Jarcec