Posted to dev@sqoop.apache.org by "Hudson (JIRA)" <ji...@apache.org> on 2012/05/27 09:18:22 UTC
[jira] [Commented] (SQOOP-483) Allow target dir to be set to a different name than table name for hive import
[ https://issues.apache.org/jira/browse/SQOOP-483?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13284116#comment-13284116 ]
Hudson commented on SQOOP-483:
------------------------------
Integrated in Sqoop-ant-jdk-1.6 #114 (See [https://builds.apache.org/job/Sqoop-ant-jdk-1.6/114/])
SQOOP-483. Allow target dir to be set to a different name than table name for hive import.
(Cheolsoo Park via Jarek Jarcec Cecho) (Revision 1342998)
Result = SUCCESS
jarcec :
Files :
* /sqoop/trunk/src/java/org/apache/sqoop/hive/HiveImport.java
* /sqoop/trunk/src/java/org/apache/sqoop/hive/TableDefWriter.java
* /sqoop/trunk/src/test/com/cloudera/sqoop/hive/TestTableDefWriter.java
> Allow target dir to be set to a different name than table name for hive import
> ------------------------------------------------------------------------------
>
> Key: SQOOP-483
> URL: https://issues.apache.org/jira/browse/SQOOP-483
> Project: Sqoop
> Issue Type: Improvement
> Reporter: Cheolsoo Park
> Assignee: Cheolsoo Park
> Fix For: 1.4.2-incubating
>
> Attachments: SQOOP-483-2.patch, SQOOP-483.patch
>
>
> Currently, the table name (set by --table) has to be the same as the target dir (set by --target-dir) for hive import. But it would be nice if the target dir could be set to a different name than the table name.
> For example, we would like to be able to run something like:
> {code}
> sqoop import ... --table "foo" --target-dir "foo1" --hive-import
> sqoop import ... --table "foo" --target-dir "foo2" --hive-import
> {code}
> Currently, this fails with the following error and call stack:
> {code}
> 12/05/02 17:11:55 INFO hive.HiveImport: FAILED: Error in semantic analysis: Line 2:17 Invalid path 'hdfs://localhost/user/cheolsoo/foo': No files matching path hdfs://localhost/user/cheolsoo/foo
> 12/05/02 17:11:56 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 10
> at com.cloudera.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:349)
> at com.cloudera.sqoop.hive.HiveImport.executeScript(HiveImport.java:299)
> at com.cloudera.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
> at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:394)
> at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:455)
> at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> {code}
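The failure above happens because the generated Hive LOAD DATA statement resolves its input path from the table name rather than from --target-dir, so a custom target dir leaves Hive pointing at a path that does not exist. The fix (in HiveImport.java and TableDefWriter.java) makes the Hive load path follow the import's actual output directory. A minimal sketch of that idea, with hypothetical names (this is not Sqoop's actual code):

```java
// Hypothetical illustration of the SQOOP-483 fix: derive the Hive LOAD DATA
// path from the explicit --target-dir when one is given, falling back to the
// table name otherwise. Class and method names are invented for this sketch.
public class LoadPathResolver {

    /** Returns the HDFS directory Hive should load data from. */
    static String resolveLoadPath(String warehouseRoot, String tableName, String targetDir) {
        // Prefer the explicit --target-dir; fall back to the table name,
        // which was the pre-fix behavior.
        String dir = (targetDir != null && !targetDir.isEmpty()) ? targetDir : tableName;
        return warehouseRoot + "/" + dir;
    }

    public static void main(String[] args) {
        // With --target-dir "foo1", the load path no longer assumes ".../foo".
        System.out.println(resolveLoadPath("hdfs://localhost/user/cheolsoo", "foo", "foo1"));
        // Without --target-dir, the table name still determines the path.
        System.out.println(resolveLoadPath("hdfs://localhost/user/cheolsoo", "foo", null));
    }
}
```

With this resolution, both example imports ("foo1" and "foo2") load from their own directories instead of both expecting hdfs://localhost/user/cheolsoo/foo.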
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira