Posted to reviews@spark.apache.org by "amaliujia (via GitHub)" <gi...@apache.org> on 2023/03/21 06:12:29 UTC

[GitHub] [spark] amaliujia commented on a diff in pull request #40498: [SPARK-42878][CONNECT] The table API in DataFrameReader could also accept options

amaliujia commented on code in PR #40498:
URL: https://github.com/apache/spark/pull/40498#discussion_r1142929171


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/DataFrameReader.scala:
##########
@@ -183,7 +183,7 @@ class DataFrameReader private[sql] (sparkSession: SparkSession) extends Logging
       dataSourceBuilder.setFormat(source)
       userSpecifiedSchema.foreach(schema => dataSourceBuilder.setSchema(schema.toDDL))
       extraOptions.foreach { case (k, v) =>
-        dataSourceBuilder.putOptions(k, v)
+        builder.getReadBuilder.putOptions(k, v)

Review Comment:
   I found I can only add a meaningful test on the server side. On the client side there is no way to verify that an option has been passed through.
   
   In the existing codebase this is testable because we can call `df.queryExecution.analyzed`, get the complete analyzed plan, and then fetch the options from it and verify them.
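   
   For reference, a rough sketch of the kind of verification the classic (non-Connect) codebase allows is shown below. It is only an illustration: the table name `test_table`, the option key `split-size`, and the choice of `DataSourceV2Relation` as the node to match are assumptions, and the table would have to resolve to a V2 relation for the option to surface there.
   
   ```scala
   import org.apache.spark.sql.SparkSession
   // Resolved V2 table reads carry their read options on this relation node.
   import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
   
   val spark = SparkSession.builder().master("local[*]").getOrCreate()
   
   // Hypothetical table name; assumes `test_table` exists in a V2 catalog.
   val df = spark.read
     .option("split-size", "10")
     .table("test_table")
   
   // The classic API exposes the analyzed logical plan on the client, so we
   // can walk it and confirm the option reached the relation. The Connect
   // client has no equivalent hook, hence the server-side test.
   val passedThrough = df.queryExecution.analyzed.collectFirst {
     case rel: DataSourceV2Relation => rel.options.get("split-size")
   }
   assert(passedThrough.contains("10"))
   ```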



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

