Posted to issues@carbondata.apache.org by "anubhav tarar (JIRA)" <ji...@apache.org> on 2018/01/03 07:18:00 UTC

[jira] [Created] (CARBONDATA-1973) User should not be able to give a duplicate column name in partition, even if it differs only in case

anubhav tarar created CARBONDATA-1973:
-----------------------------------------

             Summary: User should not be able to give a duplicate column name in partition, even if it differs only in case
                 Key: CARBONDATA-1973
                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1973
             Project: CarbonData
          Issue Type: Bug
          Components: spark-integration
    Affects Versions: 1.3.0
         Environment: spark-2.1
            Reporter: anubhav tarar
            Assignee: anubhav tarar
            Priority: Trivial


1. carbon.sql("CREATE TABLE uniqdata_char2(name char,id int) partitioned by (NAME char)stored by 'carbondata' ")
 name [uniqdata_char2]
18/01/03 12:44:44 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`uniqdata_char2` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/01/03 12:44:44 AUDIT CarbonCreateTableCommand: [anubhav-Vostro-3559][anubhav][Thread-1]Table created with Database name [default] and Table name [uniqdata_char2]
res30: org.apache.spark.sql.DataFrame = []

As we can see, the table is created successfully, even though `name` is duplicated as a partition column.

2. Try the same statement with Hive:

carbon.sql("CREATE TABLE uniqdata_char2_hive(name char,id int) partitioned by (NAME char) ")
It throws an exception:

org.apache.spark.sql.AnalysisException: Found duplicate column(s) in table definition of `uniqdata_char2_hive`: name;
  at org.apache.spark.sql.execution.datasources.AnalyzeCreateTable.org$apache$spark$sql$execution$datasources$AnalyzeCreateTable$$failAnalysis(rules.scala:198)

The behaviour of CarbonData should be similar to Hive's: the duplicate partition column name should be rejected even when it differs only in case.
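The fix amounts to comparing table columns and partition columns case-insensitively during CREATE TABLE validation. A minimal Scala sketch of that check (the object and method names are illustrative, not CarbonData source):

```scala
// Illustrative sketch (not CarbonData source): detect partition columns
// that duplicate a table column when names are compared case-insensitively,
// which is how Hive/Spark treat identifiers by default.
object PartitionColumnCheck {
  def duplicateColumns(tableCols: Seq[String], partitionCols: Seq[String]): Seq[String] = {
    val all = tableCols ++ partitionCols
    // Group by the lower-cased name; any group with more than one
    // occurrence is a case-insensitive duplicate.
    all.groupBy(_.toLowerCase)
      .collect { case (name, occurrences) if occurrences.size > 1 => name }
      .toSeq
  }

  def main(args: Array[String]): Unit = {
    // Columns from the CREATE TABLE in this report: (name, id) partitioned by (NAME)
    val dups = duplicateColumns(Seq("name", "id"), Seq("NAME"))
    println(dups)  // List(name)
  }
}
```

With such a check in place, the CarbonData statement above would fail analysis with a "duplicate column(s)" error, matching the Hive behaviour shown in step 2.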



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)