Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/12/31 07:02:35 UTC

[GitHub] [spark] MaxGekk commented on a change in pull request #26559: [SPARK-29930][SQL] Remove SQL configs declared to be removed in Spark 3.0

URL: https://github.com/apache/spark/pull/26559#discussion_r362159146
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
 ##########
 @@ -720,14 +720,6 @@ object SQLConf {
     .stringConf
     .createWithDefault("_corrupt_record")
 
-  val FROM_JSON_FORCE_NULLABLE_SCHEMA = buildConf("spark.sql.fromJsonForceNullableSchema")
 
 Review comment:
   Sure, we could throw an exception for these 3 configs. I am just wondering why we silently ignore nonexistent SQL configs:
   ```scala
   scala> spark.conf.set("spark.sql.abc", 1)
   
   ```
   How about throwing `AnalysisException` for nonexistent SQL configs that have the `spark.sql` prefix but are not present in https://github.com/apache/spark/blob/6d64fc2407e5b21a2db59c5213df438c74a31637/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala#L50 ?
   
   Or are there SQL configs that we have to bypass for some reason?
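   The validation suggested above could be sketched roughly as follows. This is a hypothetical illustration, not Spark's actual implementation: `registeredSqlConfs` is a stand-in for SQLConf's real registry of config entries, and the exception type is simplified to keep the snippet self-contained:
   ```scala
   // Hypothetical sketch of rejecting unknown `spark.sql.*` keys on set().
   // `registeredSqlConfs` stands in for SQLConf's registry of known entries.
   object ConfValidationSketch {
     val registeredSqlConfs: Set[String] = Set(
       "spark.sql.columnNameOfCorruptRecord",
       "spark.sql.shuffle.partitions"
     )

     def validateSetting(key: String): Unit = {
       if (key.startsWith("spark.sql.") && !registeredSqlConfs.contains(key)) {
         // The discussion proposes AnalysisException; IllegalArgumentException
         // is used here only so the sketch compiles without Spark on the classpath.
         throw new IllegalArgumentException(
           s"Unknown SQL config: $key; it is not registered in SQLConf.")
       }
     }
   }
   ```
   Under this sketch, `ConfValidationSketch.validateSetting("spark.sql.abc")` would throw, while keys outside the `spark.sql` prefix (or any keys a deprecation allowlist chose to bypass) would pass through unchanged.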

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org