Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/02/17 01:44:45 UTC

[GitHub] [spark] HyukjinKwon commented on a change in pull request #31571: [SPARK-34445][SQL][DOCS] Make `spark.sql.legacy.replaceDatabricksSparkAvro.enabled` as non-internal

HyukjinKwon commented on a change in pull request #31571:
URL: https://github.com/apache/spark/pull/31571#discussion_r577267118



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -2505,7 +2505,6 @@ object SQLConf {
 
   val LEGACY_REPLACE_DATABRICKS_SPARK_AVRO_ENABLED =
     buildConf("spark.sql.legacy.replaceDatabricksSparkAvro.enabled")

Review comment:
       @gengliangwang do we need this conf? We already do the same replacement for `com.databricks.spark.csv` by default internally, without any config. We could just expose this config and deprecate it. A sketch of that kind of default replacement follows below.
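
For illustration, the default replacement referred to for `com.databricks.spark.csv` amounts to a provider-name lookup that needs no legacy flag; the object and map below are a hypothetical sketch, not Spark's actual resolver:

    // Hypothetical sketch: remap a legacy data source provider name to the
    // built-in implementation by default, with no legacy config flag involved.
    object ProviderCompatibility {
      // Mapping from legacy provider names to the built-in classes that replace them.
      private val legacyProviderMap: Map[String, String] = Map(
        "com.databricks.spark.csv" ->
          "org.apache.spark.sql.execution.datasources.csv.CSVFileFormat",
        "com.databricks.spark.avro" ->
          "org.apache.spark.sql.avro.AvroFileFormat"
      )

      // Resolve a user-supplied provider name; unknown names pass through unchanged.
      def resolveProvider(name: String): String =
        legacyProviderMap.getOrElse(name, name)
    }

With a map like this applied unconditionally, spark.read.format("com.databricks.spark.csv") resolves to the built-in CSV source without consulting any configuration.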




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org