Posted to reviews@spark.apache.org by "WweiL (via GitHub)" <gi...@apache.org> on 2023/04/13 22:46:57 UTC

[GitHub] [spark] WweiL commented on a diff in pull request #40785: [SPARK-42960] Add await_termination() and exception() API for Streaming Query

WweiL commented on code in PR #40785:
URL: https://github.com/apache/spark/pull/40785#discussion_r1166096130


##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##########
@@ -2221,6 +2221,31 @@ class SparkConnectPlanner(val session: SparkSession) {
           .build()
         respBuilder.setExplain(explain)
 
+      case StreamingQueryCommand.CommandCase.EXCEPTION =>
+        val result = query.exception
+        val exception = result match {
+          case Some(e) =>
+            StreamingQueryCommandResult.ExceptionResult
+              .newBuilder()
+              .setHasException(true)
+              .setErrorMessage(SparkConnectService.extractErrorMessage(e))
+              .build()
+          case None =>
+            StreamingQueryCommandResult.ExceptionResult
+              .newBuilder()
+              .setHasException(false)
+              .build()
+        }
+        respBuilder.setException(exception)
+
+      case StreamingQueryCommand.CommandCase.AWAIT_TERMINATION =>
+        val terminated = query.awaitTermination(command.getAwaitTermination.getTimeoutMs)

Review Comment:
   In `query.py`, `_execute_await_termination_cmd` sets the default timeout to 10 ms, so I didn't set a default value here on the server side. I think we could discuss that here.
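For context, the client-side default under discussion might look like the following helper. This is a hypothetical sketch, not the actual `query.py` code: the function name `effective_timeout_ms` and the validation are illustrative assumptions; only the 10 ms default comes from the comment above.

```python
# Hypothetical sketch of a client-side timeout helper (not actual pyspark code).
# When the caller omits a timeout, fall back to a small default so the client
# can poll the server instead of blocking indefinitely.
DEFAULT_TIMEOUT_MS = 10  # the 10 ms default mentioned for query.py


def effective_timeout_ms(timeout_ms=None):
    """Return the timeout to send in the AwaitTermination command.

    None means "not specified" and falls back to the default; an explicit
    value must be positive.
    """
    if timeout_ms is None:
        return DEFAULT_TIMEOUT_MS
    if timeout_ms <= 0:
        raise ValueError("timeout_ms must be positive")
    return timeout_ms
```

Whether this fallback belongs in the Python client or in `SparkConnectPlanner` is exactly the open question raised in the comment above.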



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org