Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/04/15 15:01:16 UTC

[GitHub] [spark] marmbrus commented on a change in pull request #24365: [SPARK-27453] Pass partitionBy as options in DataFrameWriter

URL: https://github.com/apache/spark/pull/24365#discussion_r275404752
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
 ##########
 @@ -1687,6 +1687,15 @@ object SQLConf {
       .booleanConf
       .createWithDefault(false)
 
+  val LEGACY_PASS_PARTITION_BY_AS_OPTIONS =
+    buildConf("spark.sql.legacy.sources.write.passPartitionByAsOptions")
+      .internal()
+      .doc("Whether to pass the partitionBy columns as options in DataFrameWriter." +
+        " Data source V1 now silently drops partitionBy columns for non-file-format sources;" +
+        " turning the flag on provides a way for these sources to see these partitionBy columns.")
+      .booleanConf
+      .createWithDefault(true)
 
 Review comment:
   I went back and forth on what to recommend here. I think we need to balance the fact that this is a behavior change against the confusion a user will experience when the partitionBy columns they specify are silently dropped.
   
   I recommended going with `true` because: a) I don't know of any V1 sources that validate options, where this change would break an existing program, and b) the only time behavior changes is when someone specifies a (currently silently dropped) partitionBy clause.
   
   Thoughts?
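   For a V1 source that wants to pick up these columns when the flag is on, the flow can be sketched roughly as below. This is a minimal, self-contained sketch: the option key name and the comma-separated encoding are illustrative assumptions, not necessarily the encoding the PR actually uses.
   
   ```scala
   // Hypothetical sketch of passing partitionBy columns through the options map.
   // The key name and the comma-separated encoding are illustrative assumptions.
   object PartitionByOptionsSketch {
     val PartitioningColumnsKey = "__partition_columns" // assumed option key
   
     // Writer side: when the legacy flag is enabled, fold the partitionBy
     // columns into the options map handed to the V1 source.
     def withPartitionColumns(options: Map[String, String], cols: Seq[String]): Map[String, String] =
       if (cols.isEmpty) options
       else options + (PartitioningColumnsKey -> cols.mkString(","))
   
     // Source side: recover the columns, if the writer passed them.
     def partitionColumns(options: Map[String, String]): List[String] =
       options.get(PartitioningColumnsKey) match {
         case Some(encoded) => encoded.split(",").toList
         case None          => Nil
       }
   
     def main(args: Array[String]): Unit = {
       val opts = withPartitionColumns(Map("url" -> "jdbc:postgresql://example/db"), Seq("year", "month"))
       println(partitionColumns(opts)) // prints: List(year, month)
     }
   }
   ```
   
   With `createWithDefault(true)`, a non-file-format source would see the extra key in its options and could choose to honor or reject it; with the flag off, the columns are dropped exactly as before.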

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org