Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/02/20 03:55:58 UTC

[GitHub] beliefer commented on a change in pull request #23574: [SPARK-26643][SQL] Fix incorrect analysis exception about set table properties.

URL: https://github.com/apache/spark/pull/23574#discussion_r258325758
 
 

 ##########
 File path: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala
 ##########
 @@ -129,7 +129,7 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
     val invalidKeys = table.properties.keys.filter(_.startsWith(SPARK_SQL_PREFIX))
     if (invalidKeys.nonEmpty) {
       throw new AnalysisException(s"Cannot persistent ${table.qualifiedName} into hive metastore " +
-        s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
+        s"as table property keys may start with '$SPARK_SQL_PREFIX': " +
 
 Review comment:
   Yes, only internal keys are allowed to start with this prefix; user-specified keys must not use it. If a user-specified key starts with this prefix, we throw the AnalysisException to tell the user why.
   The reason is that the user-specified keys do start with this prefix.
   The reason is not that the user-specified keys don't start with it.
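   For illustration only, a minimal sketch of the check being discussed (the prefix value, the method name, and the use of IllegalArgumentException in place of Spark's AnalysisException are assumptions, not the actual HiveExternalCatalog code; the exact exception message wording is what this PR debates):

     object PropertyCheckSketch {
       // Reserved prefix for Spark-internal table properties (assumed value).
       val SPARK_SQL_PREFIX = "spark.sql."

       def verifyTableProperties(qualifiedName: String, properties: Map[String, String]): Unit = {
         // User-specified keys must NOT start with the reserved prefix.
         val invalidKeys = properties.keys.filter(_.startsWith(SPARK_SQL_PREFIX))
         if (invalidKeys.nonEmpty) {
           // The exception fires precisely because these keys DO start with the prefix.
           throw new IllegalArgumentException(
             s"Cannot persist $qualifiedName into hive metastore: found table property " +
             s"keys starting with the reserved prefix '$SPARK_SQL_PREFIX': " +
             invalidKeys.mkString(", "))
         }
       }
     }

     // Example usage: the second key below would trigger the exception.
     // PropertyCheckSketch.verifyTableProperties("db.tbl",
     //   Map("owner" -> "alice", "spark.sql.sources.provider" -> "parquet"))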

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org