Posted to reviews@spark.apache.org by "zhenlineo (via GitHub)" <gi...@apache.org> on 2023/02/27 23:10:10 UTC
[GitHub] [spark] zhenlineo commented on a diff in pull request #40160: [SPARK-41725][CONNECT] Eager Execution of DF.sql()
zhenlineo commented on code in PR #40160:
URL: https://github.com/apache/spark/pull/40160#discussion_r1119407376
##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/PlanGenerationTestSuite.scala:
##########
@@ -230,14 +230,6 @@ class PlanGenerationTestSuite
private def temporals = createLocalRelation(temporalsSchemaString)
/* Spark Session API */
- test("sql") {
Review Comment:
Please also delete the corresponding auto-generated files. Each test has three files associated with it: `.json`, `.bin`, and `.explain`.
##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########
@@ -122,8 +122,18 @@ class SparkSession(
@Experimental
def sql(sqlText: String, args: java.util.Map[String, String]): DataFrame = newDataFrame {
builder =>
- builder
- .setSql(proto.SQL.newBuilder().setQuery(sqlText).putAllArgs(args))
+ // Send the SQL once to the server and then check the output.
+ val cmd = newCommand(b =>
+ b.setSqlCommand(proto.SqlCommand.newBuilder().setSql(sqlText).putAllArgs(args)))
+ val plan = proto.Plan.newBuilder().setCommand(cmd)
+ val responseIter = client.execute(plan.build())
+
+ val response = responseIter.asScala
+ .find(_.hasSqlCommandResult)
+ .getOrElse(throw new RuntimeException("SQLCommandResult must be present"))
Review Comment:
Maybe `require(...)` on the result, or an `IllegalArgumentException`?
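To illustrate the suggestion, here is a minimal self-contained sketch of the fail-fast check using `require` (which throws `IllegalArgumentException`). The `Response` case class and `findSqlCommandResult` helper are hypothetical stand-ins for the generated proto response type, not the actual Spark Connect API:

```scala
// Hypothetical stand-in for the generated proto response type
// (the real code uses proto.ExecutePlanResponse).
case class Response(sqlCommandResult: Option[String]) {
  def hasSqlCommandResult: Boolean = sqlCommandResult.isDefined
}

object EagerSqlSketch {
  // Scan the response stream for the first SQLCommandResult and fail
  // fast via require (IllegalArgumentException) if none is present,
  // instead of throwing a generic RuntimeException.
  def findSqlCommandResult(responses: Iterator[Response]): Response = {
    val result = responses.find(_.hasSqlCommandResult)
    require(result.isDefined, "SQLCommandResult must be present")
    result.get
  }
}
```

Whether `require` is the right tool is debatable, since a missing result reflects the server's response stream rather than a caller-supplied argument; `IllegalStateException` would also be a defensible choice.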
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org