Posted to dev@hive.apache.org by "Sushanth Sowmyan (JIRA)" <ji...@apache.org> on 2014/07/22 21:12:40 UTC
[jira] [Created] (HIVE-7472) CLONE - Import fails for tables created with default text, sequence and orc file formats using HCatalog API
Sushanth Sowmyan created HIVE-7472:
--------------------------------------
Summary: CLONE - Import fails for tables created with default text, sequence and orc file formats using HCatalog API
Key: HIVE-7472
URL: https://issues.apache.org/jira/browse/HIVE-7472
Project: Hive
Issue Type: Bug
Components: HCatalog
Affects Versions: 0.11.0
Reporter: Sushanth Sowmyan
Assignee: Sushanth Sowmyan
Fix For: 0.13.0
When a table is created using the HCatalog API without specifying a file format, it defaults to:
{code}
fileFormat=TextFile, inputformat=org.apache.hadoop.mapred.TextInputFormat, outputformat=org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat
{code}
However, when Hive fetches the table from the metastore, it replaces the output format with org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, so the comparison between the source and target tables fails.
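The mismatch can be reproduced in isolation. A minimal, self-contained sketch (the class names are taken from this report; the plain string comparison mirrors what import does, simplified):

```java
public class FormatMismatchDemo {
    public static void main(String[] args) {
        // Output format recorded for the HCatalog-created (exported) table:
        String importedofc = "org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat";
        // Output format Hive reports after fetching the table from the metastore:
        String existingofc = "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat";
        // The raw string comparison used at import time fails even though both
        // names refer to the same text output behavior:
        System.out.println(existingofc.equals(importedofc)); // prints "false"
    }
}
```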
The code in org.apache.hadoop.hive.ql.parse.ImportSemanticAnalyzer#checkTable compares the class names as plain strings and fails:
{code}
// check IF/OF/Serde
String existingifc = table.getInputFormatClass().getName();
String importedifc = tableDesc.getInputFormat();
String existingofc = table.getOutputFormatClass().getName();
String importedofc = tableDesc.getOutputFormat();
if ((!existingifc.equals(importedifc))
    || (!existingofc.equals(importedofc))) {
  throw new SemanticException(
      ErrorMsg.INCOMPATIBLE_SCHEMA
          .getMsg(" Table inputformat/outputformats do not match"));
}
{code}
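One way to tolerate the substitution is to normalize known deprecated/alias class names before comparing, instead of relying on raw String.equals. The sketch below is hypothetical: the alias table and the helper names are illustrative, not the actual Hive fix.

```java
import java.util.HashMap;
import java.util.Map;

public class FormatComparison {
    // Known alias: IgnoreKeyTextOutputFormat is deprecated in favor of the
    // Hive-prefixed class, which the metastore substitutes on fetch.
    private static final Map<String, String> ALIASES = new HashMap<>();
    static {
        ALIASES.put("org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat",
                    "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat");
    }

    // Map a class name to its canonical form; unknown names pass through.
    static String normalize(String className) {
        return ALIASES.getOrDefault(className, className);
    }

    // Compare two format class names after normalization.
    static boolean formatsMatch(String existing, String imported) {
        return normalize(existing).equals(normalize(imported));
    }

    public static void main(String[] args) {
        System.out.println(formatsMatch(
            "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat",
            "org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat")); // prints "true"
    }
}
```

With a comparison of this shape, the exported table's deprecated output format and the metastore's substituted one would be treated as equivalent, and the import would no longer raise INCOMPATIBLE_SCHEMA for the text-file case.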
This affects only tables with text and sequence file formats, not RCFile or ORC.
--
This message was sent by Atlassian JIRA
(v6.2#6252)