Posted to issues@spark.apache.org by "jiaan.geng (JIRA)" <ji...@apache.org> on 2019/02/21 01:57:00 UTC

[jira] [Updated] (SPARK-26643) Spark Hive throws an AnalysisException when setting table properties, but the AnalysisException contains one typo and one unsuited word.

     [ https://issues.apache.org/jira/browse/SPARK-26643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

jiaan.geng updated SPARK-26643:
-------------------------------
    Summary: Spark Hive throws an AnalysisException when setting table properties, but the AnalysisException contains one typo and one unsuited word.  (was: Spark Hive throws an AnalysisException when setting table properties, but the AnalysisException contains two typos.)

> Spark Hive throws an AnalysisException when setting table properties, but the AnalysisException contains one typo and one unsuited word.
> -----------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26643
>                 URL: https://issues.apache.org/jira/browse/SPARK-26643
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0, 2.4.0, 3.0.0
>            Reporter: jiaan.geng
>            Priority: Minor
>
> When I execute a DDL statement in spark-sql, it throws an AnalysisException as follows:
> {code:java}
> spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test');
> org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into hive metastore as table property keys may not start with 'spark.sql.': [spark.sql.partitionProvider];
> at org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129){code}
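> The same error can also be reproduced programmatically; a minimal sketch, assuming a Hive-enabled SparkSession and an existing table gja_test3 (in the work database, per the error above):
> {code:java}
> // Hypothetical reproduction sketch, not part of the original report.
> import org.apache.spark.sql.SparkSession
>
> val spark = SparkSession.builder()
>   .appName("SPARK-26643-repro")
>   .enableHiveSupport()
>   .getOrCreate()
>
> // Writing the altered metadata back to the Hive metastore triggers verifyTableProperties.
> spark.sql("ALTER TABLE work.gja_test3 SET TBLPROPERTIES ('test' = 'test')")
> {code}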
> I found that the message of this exception contains one typo and one unsuited word.
> The typo is "persistent", which should be "persist".
> What does the method named verifyTableProperties do? I checked the comment of this method, which says:
> {code:java}
> /**
> * If the given table properties contains datasource properties, throw an exception. We will do
> * this check when create or alter a table, i.e. when we try to write table metadata to Hive
> * metastore.
> */
> {code}
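> In other words, the method rejects any table property whose key uses Spark's reserved prefix. A simplified, self-contained sketch of such a check (not the actual HiveExternalCatalog code; a plain IllegalArgumentException stands in for AnalysisException so the snippet compiles outside Spark's sql package, and the message mirrors the current wording that this issue proposes to fix):
> {code:java}
> // Hypothetical sketch of a prefix check like verifyTableProperties.
> val SPARK_SQL_PREFIX = "spark.sql."
>
> def verifyTableProperties(qualifiedName: String, properties: Map[String, String]): Unit = {
>   // Collect property keys that use the reserved prefix and fail if any are present.
>   val invalidKeys = properties.keys.filter(_.startsWith(SPARK_SQL_PREFIX)).toSeq
>   if (invalidKeys.nonEmpty) {
>     throw new IllegalArgumentException(
>       s"Cannot persistent $qualifiedName into hive metastore " +
>       s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
>       invalidKeys.mkString("[", ", ", "]"))
>   }
> }
> {code}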
> So I think the message of this AnalysisException should change from
> {code:java}
> throw new AnalysisException(s"Cannot persistent ${table.qualifiedName} into hive metastore " +
>  s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
>  invalidKeys.mkString("[", ", ", "]")){code}
> to
> {code:java}
> throw new AnalysisException(s"Cannot persist ${table.qualifiedName} into Hive metastore " +
>  s"as table property keys may start with '$SPARK_SQL_PREFIX': " +
>  invalidKeys.mkString("[", ", ", "]")){code}
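> With the proposed wording, the same DDL would report a message along these lines (an example only, reusing the table name and property key from the error above):
> {code:java}
> org.apache.spark.sql.AnalysisException: Cannot persist work.gja_test3 into Hive metastore as table property keys may start with 'spark.sql.': [spark.sql.partitionProvider];
> {code}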



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org