Posted to reviews@spark.apache.org by "vicennial (via GitHub)" <gi...@apache.org> on 2023/04/05 11:16:16 UTC

[GitHub] [spark] vicennial opened a new pull request, #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

vicennial opened a new pull request, #40675:
URL: https://github.com/apache/spark/pull/40675

   <!--
   Thanks for sending a pull request!  Here are some tips for you:
     1. If this is your first time, please read our contributor guidelines: https://spark.apache.org/contributing.html
     2. Ensure you have added or run the appropriate tests for your PR: https://spark.apache.org/developer-tools.html
     3. If the PR is unfinished, add '[WIP]' in your PR title, e.g., '[WIP][SPARK-XXXX] Your PR title ...'.
     4. Be sure to keep the PR description updated to reflect all changes.
     5. Please write your PR title to summarize what this PR proposes.
     6. If possible, provide a concise example to reproduce the issue for a faster review.
     7. If you want to add a new configuration, please read the guideline first for naming configurations in
        'core/src/main/scala/org/apache/spark/internal/config/ConfigEntry.scala'.
     8. If you want to add or modify an error type or message, please read the guideline first in
        'core/src/main/resources/error/README.md'.
   -->
   
   ### What changes were proposed in this pull request?
   <!--
   Please clarify what changes you are proposing. The purpose of this section is to outline the changes and how this PR fixes the issue. 
   If possible, please consider writing useful notes for better and faster reviews in your PR. See the examples below.
     1. If you refactor some codes with changing classes, showing the class hierarchy will help reviewers.
     2. If you fix some SQL features, you can provide some references of other DBMSes.
     3. If there is design documentation, please add the link.
     4. If there is a discussion in the mailing list, please add the link.
   -->
   
   This PR introduces the concept of a `ClassFinder` that can scan the REPL output (either file-based or in-memory) for generated class files. The `ClassFinder` is registered during initialization of the REPL and aids in uploading the generated class files as artifacts to the Spark Connect server.
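   A minimal sketch of what a file-based finder along these lines could look like (illustrative only; the trait name `ClassFinder` comes from the PR description, but `findClasses` and `FileBasedClassFinder` are hypothetical names, not necessarily the PR's actual API):

   ```scala
   import java.nio.file.{Files, Path}
   import scala.collection.JavaConverters._

   // Hypothetical interface: something that can enumerate the class files a
   // REPL has generated so far.
   trait ClassFinder {
     def findClasses(): Iterator[Path]
   }

   // File-based variant: walk the REPL's output directory and collect every
   // generated .class file.
   class FileBasedClassFinder(outputDir: Path) extends ClassFinder {
     override def findClasses(): Iterator[Path] =
       Files.walk(outputDir).iterator().asScala
         .filter(p => Files.isRegularFile(p) && p.toString.endsWith(".class"))
   }
   ```

   An in-memory variant would implement the same trait against the REPL compiler's output map instead of the filesystem.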
   
   
   ### Why are the changes needed?
   <!--
   Please clarify why the changes are needed. For instance,
     1. If you propose a new API, clarify the use case for a new API.
     2. If you fix a bug, you can clarify why it is a bug.
   -->
   
   To run UDFs defined in the client-side REPL, we need a mechanism that finds the locally generated REPL classfiles and then uses the artifact-transfer mechanism from https://issues.apache.org/jira/browse/SPARK-42653 to ship them to the server as artifacts.
   
   ### Does this PR introduce _any_ user-facing change?
   <!--
   Note that it means *any* user-facing change including all aspects such as the documentation fix.
   If yes, please clarify the previous behavior and the change this PR proposes - provide the console output, description and/or an example to show the behavior difference if possible.
   If possible, please also clarify if this is a user-facing change compared to the released Spark versions or within the unreleased branches such as master.
   If no, write 'No'.
   -->
   
   Yes, users can now run UDFs on the default (Ammonite) REPL with Spark Connect.
   
   Input (in REPL):
   ```
   class A(x: Int) { def get = x * 5 + 19 }
   def dummyUdf(x: Int): Int = new A(x).get
   val myUdf = udf(dummyUdf _)
   spark.range(5).select(myUdf(col("id"))).as[Int].collect()
   ```
   
   Output:
   ```
   Array[Int] = Array(19, 24, 29, 34, 39)
   ```
   
   ### How was this patch tested?
   <!--
   If tests were added, say they were added here. Please make sure to add some test cases that check the changes thoroughly including negative and positive cases if possible.
   If it was tested in a way different from regular unit tests, please clarify how you tested step by step, ideally copy and paste-able, so that other reviewers can test and check, and descendants can verify in the future.
   If tests were not added, please describe why they were not added and/or why it was difficult to add.
   If benchmark tests were added, please run the benchmarks in GitHub Actions for the consistent environment, and the instructions could accord to: https://spark.apache.org/developer-tools.html#github-workflow-benchmarks.
   -->
   
   Unit tests + E2E tests.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1166754397


##########
.github/workflows/build_and_test.yml:
##########
@@ -247,7 +247,10 @@ jobs:
     # Run the tests.
     - name: Run tests
       env: ${{ fromJSON(inputs.envs) }}
+      shell: 'script -q -e -c "bash {0}"'
       run: |
+        # Fix for TTY related issues when laaunching Ammonite REPL in tests.

Review Comment:
   Good eye, fixed!





[GitHub] [spark] hvanhovell commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "hvanhovell (via GitHub)" <gi...@apache.org>.
hvanhovell commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1158719645


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########


Review Comment:
   Why this change?





[GitHub] [spark] LuciferYang commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1515558035

   > @LuciferYang which GitHub job failed? The link seems stale now so I can't see. It is a scheduled build?
   
   I found this issue in the GitHub Actions run of a user PR.
   
   The last failed build on https://github.com/apache/spark/pull/40783 also had this issue; I don't know if @rangadi can provide a new link.




[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1169904255


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/connect/client/SparkConnectClient.scala:
##########
@@ -58,10 +58,13 @@ private[sql] class SparkConnectClient(
    * @return
    *   A [[proto.AnalyzePlanResponse]] from the Spark Connect server.
    */
-  def analyze(request: proto.AnalyzePlanRequest): proto.AnalyzePlanResponse =
+  def analyze(request: proto.AnalyzePlanRequest): proto.AnalyzePlanResponse = {
+    artifactManager.uploadAllClassFileArtifacts()

Review Comment:
   > Does this method cache previous scanned artifacts?
   
   Good eye but not at the moment, no. However, the plan is to implement this later as an optimisation.
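   One possible shape for that follow-up optimisation is to fingerprint each scanned class file and skip re-uploading unchanged ones (a sketch under stated assumptions; `UploadCache` and `filterNew` are hypothetical names, not part of the PR):

   ```scala
   import java.nio.file.{Files, Path}
   import java.security.MessageDigest
   import scala.collection.mutable

   // Remembers a content digest for every class file already uploaded, so a
   // subsequent scan only returns files that are new or whose bytes changed.
   class UploadCache {
     private val uploaded = mutable.Map.empty[Path, String]

     private def sha1(p: Path): String =
       MessageDigest.getInstance("SHA-1").digest(Files.readAllBytes(p))
         .map("%02x".format(_)).mkString

     // Keep only paths not seen before, or seen with different contents;
     // record their digests as a side effect.
     def filterNew(paths: Seq[Path]): Seq[Path] = paths.filter { p =>
       val digest = sha1(p)
       val changed = !uploaded.get(p).contains(digest)
       if (changed) uploaded(p) = digest
       changed
     }
   }
   ```

   With something like this in front of the upload call, repeated `analyze`/`execute` round-trips would only transfer classfiles produced since the previous scan.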





[GitHub] [spark] hvanhovell commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "hvanhovell (via GitHub)" <gi...@apache.org>.
hvanhovell commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1160073701


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,123 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.application
+
+import java.io.{PipedInputStream, PipedOutputStream}
+import java.util.concurrent.{Executors, Semaphore, TimeUnit}
+
+import org.apache.commons.io.output.ByteArrayOutputStream
+
+import org.apache.spark.sql.connect.client.util.RemoteSparkSession
+
+class ReplE2ESuite extends RemoteSparkSession {

Review Comment:
   Yes, REPL-generated code. There are a lot of weird things possible, so I think we need more depth here. We can do this as a follow-up, though.





[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1159690669


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########
@@ -488,6 +488,14 @@ class SparkSession private[sql] (
   @scala.annotation.varargs
   def addArtifacts(uri: URI*): Unit = client.addArtifacts(uri)
 
+  /**
+   * Register a [[ClassFinder]] for dynamically generated classes.
+   *
+   * @since 3.4.0
+   */
+  @Experimental

Review Comment:
   > Also mark @Private?
   Wrt the above comment, we can't since we register the class finder as part of the predef code. 
   I'm not too well versed with Ammonite internals, I wonder if we can get the REPL session without having to go through predef code.







[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1159377092


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,123 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.application
+
+import java.io.{PipedInputStream, PipedOutputStream}
+import java.util.concurrent.{Executors, Semaphore, TimeUnit}
+
+import org.apache.commons.io.output.ByteArrayOutputStream
+
+import org.apache.spark.sql.connect.client.util.RemoteSparkSession
+
+class ReplE2ESuite extends RemoteSparkSession {
+
+  private val executorService = Executors.newSingleThreadExecutor()
+  private val TIMEOUT_SECONDS = 10
+
+  private var testSuiteOut: PipedOutputStream = _
+  private var ammoniteOut: ByteArrayOutputStream = _
+  private var errorStream: ByteArrayOutputStream = _
+  private var ammoniteIn: PipedInputStream = _
+  private val semaphore: Semaphore = new Semaphore(0)
+
+  private def getCleanString(out: ByteArrayOutputStream): String = {
+    // Remove ANSI colour codes
+    // Regex taken from https://stackoverflow.com/a/25189932
+    out.toString("UTF-8").replaceAll("\u001B\\[[\\d;]*[^\\d;]", "")
+  }
+
+  override def beforeAll(): Unit = {
+    super.beforeAll()
+    ammoniteOut = new ByteArrayOutputStream()
+    testSuiteOut = new PipedOutputStream()
+    // Connect the `testSuiteOut` and `ammoniteIn` pipes
+    ammoniteIn = new PipedInputStream(testSuiteOut)
+    errorStream = new ByteArrayOutputStream()
+
+    val args = Array("--port", serverPort.toString)
+    val task = new Runnable {
+      override def run(): Unit = {
+        ConnectRepl.doMain(
+          args = args,
+          semaphore = Some(semaphore),
+          inputStream = ammoniteIn,
+          outputStream = ammoniteOut,
+          errorStream = errorStream)
+      }
+    }
+
+    executorService.submit(task)
+
+    // Run simple query to test REPL
+    val output = runCommandsInShell("spark.sql(\"select 1\").collect()")

Review Comment:
   Ack. 





[GitHub] [spark] LuciferYang commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1178635161


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,128 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.application
+
+import java.io.{PipedInputStream, PipedOutputStream}
+import java.util.concurrent.{Executors, Semaphore, TimeUnit}
+
+import org.apache.commons.io.output.ByteArrayOutputStream
+
+import org.apache.spark.sql.connect.client.util.RemoteSparkSession
+
+class ReplE2ESuite extends RemoteSparkSession {
+
+  private val executorService = Executors.newSingleThreadExecutor()
+  private val TIMEOUT_SECONDS = 10
+
+  private var testSuiteOut: PipedOutputStream = _
+  private var ammoniteOut: ByteArrayOutputStream = _
+  private var errorStream: ByteArrayOutputStream = _
+  private var ammoniteIn: PipedInputStream = _
+  private val semaphore: Semaphore = new Semaphore(0)
+
+  private def getCleanString(out: ByteArrayOutputStream): String = {
+    // Remove ANSI colour codes
+    // Regex taken from https://stackoverflow.com/a/25189932
+    out.toString("UTF-8").replaceAll("\u001B\\[[\\d;]*[^\\d;]", "")
+  }
+
+  override def beforeAll(): Unit = {
+    super.beforeAll()
+    ammoniteOut = new ByteArrayOutputStream()
+    testSuiteOut = new PipedOutputStream()
+    // Connect the `testSuiteOut` and `ammoniteIn` pipes
+    ammoniteIn = new PipedInputStream(testSuiteOut)
+    errorStream = new ByteArrayOutputStream()
+
+    val args = Array("--port", serverPort.toString)
+    val task = new Runnable {
+      override def run(): Unit = {
+        ConnectRepl.doMain(
+          args = args,
+          semaphore = Some(semaphore),
+          inputStream = ammoniteIn,
+          outputStream = ammoniteOut,
+          errorStream = errorStream)
+      }
+    }
+
+    executorService.submit(task)
+  }
+
+  override def afterAll(): Unit = {
+    executorService.shutdownNow()
+    super.afterAll()
+  }
+
+  def runCommandsInShell(input: String): String = {
+    require(input.nonEmpty)
+    // Pad the input with a semaphore release so that we know when the execution of the provided
+    // input is complete.
+    val paddedInput = input + '\n' + "semaphore.release()\n"
+    testSuiteOut.write(paddedInput.getBytes)
+    testSuiteOut.flush()
+    if (!semaphore.tryAcquire(TIMEOUT_SECONDS, TimeUnit.SECONDS)) {
+      val failOut = getCleanString(ammoniteOut)
+      val errOut = getCleanString(errorStream)
+      val errorString =
+        s"""
+          |REPL Timed out while running command: $input
+          |Console output: $failOut
+          |Error output: $errOut
+          |""".stripMargin
+      throw new RuntimeException(errorString)
+    }
+    getCleanString(ammoniteOut)
+  }
+
+  def assertContains(message: String, output: String): Unit = {
+    val isContain = output.contains(message)
+    assert(isContain, "Ammonite output did not contain '" + message + "':\n" + output)
+  }
+
+  test("Simple query") {

Review Comment:
   https://github.com/apache/spark/actions/runs/4813779407
   
   Yesterday's test passed, thanks ~
   
   





[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1159370697


##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/SimpleSparkConnectService.scala:
##########
@@ -38,7 +38,7 @@ private[sql] object SimpleSparkConnectService {
   private val stopCommand = "q"
 
   def main(args: Array[String]): Unit = {
-    val conf = new SparkConf()
+    val conf = new SparkConf().set("connect.test", "true")

Review Comment:
   I tried this first but it breaks testing because it attempts to launch multiple servers (IIRC I was getting hit by port-already-in-use errors).





[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1159376657


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,123 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.application
+
+import java.io.{PipedInputStream, PipedOutputStream}
+import java.util.concurrent.{Executors, Semaphore, TimeUnit}
+
+import org.apache.commons.io.output.ByteArrayOutputStream
+
+import org.apache.spark.sql.connect.client.util.RemoteSparkSession
+
+class ReplE2ESuite extends RemoteSparkSession {

Review Comment:
   To clarify, you mean UDF specific tests right?





[GitHub] [spark] LuciferYang commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1515558746

   @HyukjinKwon https://github.com/rangadi/spark/actions/runs/4737137341/jobs/8409588363 this one




[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1159374961


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########
@@ -488,6 +488,14 @@ class SparkSession private[sql] (
   @scala.annotation.varargs
   def addArtifacts(uri: URI*): Unit = client.addArtifacts(uri)
 
+  /**
+   * Register a [[ClassFinder]] for dynamically generated classes.
+   *
+   * @since 3.4.0
+   */
+  @Experimental

Review Comment:
   > There is part of me that thinks we should be doing this as part of building the session, instead of adding it while running.
   
   I tried that initially, but the `repl.sess` that we pass in `spark.registerClassFinder(new AmmoniteClassFinder(repl.sess))` is only available once the REPL is instantiated.
   





[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1164025219


##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/SimpleSparkConnectService.scala:
##########
@@ -38,7 +38,7 @@ private[sql] object SimpleSparkConnectService {
   private val stopCommand = "q"
 
   def main(args: Array[String]): Unit = {
-    val conf = new SparkConf()
+    val conf = new SparkConf().set("connect.test", "true")

Review Comment:
   Ah, thanks for the info, will try this.





[GitHub] [spark] HyukjinKwon commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1515507340

   Is this just being flaky? All GA jobs share the same thing (that you fixed).




[GitHub] [spark] HyukjinKwon commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1515545618

   @LuciferYang which GitHub job failed? The link seems stale now so I can't see. It is a scheduled build?




[GitHub] [spark] hvanhovell closed pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "hvanhovell (via GitHub)" <gi...@apache.org>.
hvanhovell closed pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts  
URL: https://github.com/apache/spark/pull/40675




[GitHub] [spark] hvanhovell commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "hvanhovell (via GitHub)" <gi...@apache.org>.
hvanhovell commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1158714617


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,123 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.application
+
+import java.io.{PipedInputStream, PipedOutputStream}
+import java.util.concurrent.{Executors, Semaphore, TimeUnit}
+
+import org.apache.commons.io.output.ByteArrayOutputStream
+
+import org.apache.spark.sql.connect.client.util.RemoteSparkSession
+
+class ReplE2ESuite extends RemoteSparkSession {
+
+  private val executorService = Executors.newSingleThreadExecutor()
+  private val TIMEOUT_SECONDS = 10
+
+  private var testSuiteOut: PipedOutputStream = _
+  private var ammoniteOut: ByteArrayOutputStream = _
+  private var errorStream: ByteArrayOutputStream = _
+  private var ammoniteIn: PipedInputStream = _
+  private val semaphore: Semaphore = new Semaphore(0)
+
+  private def getCleanString(out: ByteArrayOutputStream): String = {
+    // Remove ANSI colour codes
+    // Regex taken from https://stackoverflow.com/a/25189932
+    out.toString("UTF-8").replaceAll("\u001B\\[[\\d;]*[^\\d;]", "")
+  }
+
+  override def beforeAll(): Unit = {
+    super.beforeAll()
+    ammoniteOut = new ByteArrayOutputStream()
+    testSuiteOut = new PipedOutputStream()
+    // Connect the `testSuiteOut` and `ammoniteIn` pipes
+    ammoniteIn = new PipedInputStream(testSuiteOut)
+    errorStream = new ByteArrayOutputStream()
+
+    val args = Array("--port", serverPort.toString)
+    val task = new Runnable {
+      override def run(): Unit = {
+        ConnectRepl.doMain(
+          args = args,
+          semaphore = Some(semaphore),
+          inputStream = ammoniteIn,
+          outputStream = ammoniteOut,
+          errorStream = errorStream)
+      }
+    }
+
+    executorService.submit(task)
+
+    // Run simple query to test REPL
+    val output = runCommandsInShell("spark.sql(\"select 1\").collect()")

Review Comment:
   use a multi-line string that is easier to read. `"""spark.sql("select 1").collect()"""`
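   As a sketch of this suggestion (using the command string from the diff above), escaping versus triple quoting compare as follows:

   ```scala
   // With escaped quotes (harder to read):
   val escaped = "spark.sql(\"select 1\").collect()"
   // With a triple-quoted string, as the reviewer suggests:
   val tripleQuoted = """spark.sql("select 1").collect()"""
   // Both yield the same command string.
   assert(escaped == tripleQuoted)
   ```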





[GitHub] [spark] hvanhovell commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "hvanhovell (via GitHub)" <gi...@apache.org>.
hvanhovell commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1158715337


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,123 @@
+    executorService.submit(task)
+
+    // Run simple query to test REPL
+    val output = runCommandsInShell("spark.sql(\"select 1\").collect()")

Review Comment:
   BTW just do this in a test. It is weird to have this in the beforeAll.
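   A possible restructuring along these lines (hypothetical sketch; `runCommandsInShell` and `assertContains` are the helpers from the diff above, and the expected-output string is illustrative):

   ```scala
   // Instead of running the smoke query in beforeAll, verify the REPL
   // wiring in an ordinary test case:
   test("REPL smoke test") {
     val output = runCommandsInShell("""spark.sql("select 1").collect()""")
     // Illustrative expectation; the exact REPL echo may differ.
     assertContains("res", output)
   }
   ```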





[GitHub] [spark] hvanhovell commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "hvanhovell (via GitHub)" <gi...@apache.org>.
hvanhovell commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1158718071


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,123 @@
+class ReplE2ESuite extends RemoteSparkSession {

Review Comment:
   This should be beefed up quite a bit. @rednaxelafx @cloud-fan do we have a comprehensive set of tests somewhere?





[GitHub] [spark] zhenlineo commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "zhenlineo (via GitHub)" <gi...@apache.org>.
zhenlineo commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1515510014

   @HyukjinKwon I would not think it is flaky. It is probably the GA builds also missing the TTY setup in the build steps. So we need to copy the following changes to the steps:
   ```
         run: |
           # Fix for TTY related issues when launching the Ammonite REPL in tests.
           export TERM=vt100 && script -qfc 'echo exit | amm -s' && rm typescript
   ```
   https://github.com/apache/spark/pull/40675/files#diff-48c0ee97c53013d18d6bbae44648f7fab9af2e0bf5b0dc1ca761e18ec5c478f2R250-R253




[GitHub] [spark] LuciferYang commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1512936430

   Found an error related to `ReplE2ESuite` in this [GA test task](https://pipelines.actions.githubusercontent.com/serviceHosts/c184045e-b556-4e78-b8ef-fb37b2eda9a3/_apis/pipelines/1/runs/69161/signedlogcontent/23?urlExpires=2023-04-18T10%3A18%3A24.5673026Z&urlSigningMethod=HMACV1&urlSignature=IZ4kWbB8mtkvxvyxojX3%2FxIz43j%2FVRKl7Ghp2Y52nnE%3D):
   
   ```
   2023-04-18T03:29:20.8938544Z [info] ReplE2ESuite:
   2023-04-18T03:29:24.5685770Z sh: 1: cannot open /dev/tty: No such device or address
   ...
   2023-04-18T03:29:25.5551653Z sh: 1: cannot open /dev/tty: No such device or address
   2023-04-18T03:29:25.5631256Z sh: 1: cannot create /dev/tty: No such device or address
   ...
   2023-04-18T03:29:43.5473148Z [info]   java.lang.RuntimeException: REPL Timed out while running command: 
   2023-04-18T03:29:43.5473697Z [info] class A(x: Int) { def get = x * 5 + 19 }
   2023-04-18T03:29:43.5474147Z [info] def dummyUdf(x: Int): Int = new A(x).get
   2023-04-18T03:29:43.5474578Z [info] val myUdf = udf(dummyUdf _)
   2023-04-18T03:29:43.5480701Z [info] spark.range(5).select(myUdf(col("id"))).as[Int].collect()
   2023-04-18T03:29:43.5481161Z [info]       
   2023-04-18T03:29:43.5481539Z [info] Console output: 
   2023-04-18T03:29:43.5482081Z [info] Spark session available as 'spark'.
   2023-04-18T03:29:43.5484862Z [info]    _____                  __      ______                            __
   2023-04-18T03:29:43.5488022Z [info]   / ___/____  ____ ______/ /__   / ____/___  ____  ____  ___  _____/ /_
   2023-04-18T03:29:43.5491144Z [info]   \__ \/ __ \/ __ `/ ___/ //_/  / /   / __ \/ __ \/ __ \/ _ \/ ___/ __/
   2023-04-18T03:29:43.5494244Z [info]  ___/ / /_/ / /_/ / /  / ,<    / /___/ /_/ / / / / / / /  __/ /__/ /_
   2023-04-18T03:29:43.5497421Z [info] /____/ .___/\__,_/_/  /_/|_|   \____/\____/_/ /_/_/ /_/\___/\___/\__/
   2023-04-18T03:29:43.5500566Z [info]     /_/
   2023-04-18T03:29:43.5502973Z sh: 1: cannot open /dev/tty: No such device or address
   ...
   2023-04-18T03:29:43.7154910Z [info] Error output: Compiling (synthetic)/ammonite/predef/ArgsPredef.sc
   2023-04-18T03:29:43.7155688Z [info] Compiling /home/runner/work/spark/spark/connector/connect/client/jvm/(console)
   2023-04-18T03:29:43.7156273Z [info] java.lang.RuntimeException: Nonzero exit value: 2
   2023-04-18T03:29:43.7156688Z [info]   scala.sys.package$.error(package.scala:30)
   2023-04-18T03:29:43.7157121Z [info]   scala.sys.process.ProcessBuilderImpl$AbstractBuilder.slurp(ProcessBuilderImpl.scala:138)
   2023-04-18T03:29:43.7157538Z [info]   scala.sys.process.ProcessBuilderImpl$AbstractBuilder.$bang$bang(ProcessBuilderImpl.scala:108)
   2023-04-18T03:29:43.7158104Z [info]   ammonite.terminal.TTY$.stty(Utils.scala:103)
   2023-04-18T03:29:43.7158794Z [info]   ammonite.terminal.TTY$.withSttyOverride(Utils.scala:114)
   2023-04-18T03:29:43.7160239Z [info]   ammonite.terminal.Terminal$.readLine(Terminal.scala:41)
   2023-04-18T03:29:43.7162012Z [info]   ammonite.repl.AmmoniteFrontEnd.readLine(AmmoniteFrontEnd.scala:133)
   2023-04-18T03:29:43.7163249Z [info]   ammonite.repl.AmmoniteFrontEnd.action(AmmoniteFrontEnd.scala:28)
   2023-04-18T03:29:43.7163556Z [info]   ammonite.repl.Repl.$anonfun$action$4(Repl.scala:194)
   2023-04-18T03:29:43.7163863Z [info]   ammonite.repl.Scoped.$anonfun$flatMap$1(Signaller.scala:45)
   2023-04-18T03:29:43.7164158Z [info]   ammonite.repl.Signaller.apply(Signaller.scala:28)
   2023-04-18T03:29:43.7171303Z [info]   ammonite.repl.Scoped.flatMap(Signaller.scala:45)
   2023-04-18T03:29:43.7172151Z [info]   ammonite.repl.Scoped.flatMap$(Signaller.scala:45)
   2023-04-18T03:29:43.7172894Z [info]   ammonite.repl.Signaller.flatMap(Signaller.scala:16)
   2023-04-18T03:29:43.7174248Z [info]   ammonite.repl.Repl.$anonfun$action$2(Repl.scala:178)
   2023-04-18T03:29:43.7176993Z [info]   ammonite.util.Catching.flatMap(Res.scala:115)
   2023-04-18T03:29:43.7177811Z [info]   ammonite.repl.Repl.action(Repl.scala:170)
   2023-04-18T03:29:43.7178715Z [info]   ammonite.repl.Repl.loop$1(Repl.scala:212)
   2023-04-18T03:29:43.7179009Z [info]   ammonite.repl.Repl.run(Repl.scala:227)
   2023-04-18T03:29:43.7179272Z [info]   ammonite.Main.$anonfun$run$1(Main.scala:236)
   2023-04-18T03:29:43.7179543Z [info]   scala.Option.getOrElse(Option.scala:189)
   2023-04-18T03:29:43.7179795Z [info]   ammonite.Main.run(Main.scala:224)
   2023-04-18T03:29:43.7184102Z [info]   org.apache.spark.sql.application.ConnectRepl$.doMain(ConnectRepl.scala:100)
   2023-04-18T03:29:43.7185205Z [info]   org.apache.spark.sql.application.ReplE2ESuite$$anon$1.run(ReplE2ESuite.scala:59)
   2023-04-18T03:29:43.7185569Z [info]   java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
   2023-04-18T03:29:43.7185885Z [info]   java.util.concurrent.FutureTask.run(FutureTask.java:266)
   2023-04-18T03:29:43.7191750Z [info]   java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   2023-04-18T03:29:43.7192249Z [info]   java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   2023-04-18T03:29:43.7192662Z [info]   java.lang.Thread.run(Thread.java:750)
   ```
   
   I am currently unsure why this error occurred.
   
   




[GitHub] [spark] pan3793 commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "pan3793 (via GitHub)" <gi...@apache.org>.
pan3793 commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1515619326

   > @HyukjinKwon I would not think it is flaky. It is probably the GA builds also missing the TTY setup in the build steps. So we need to copy the following changes to the steps:
   > 
   > ```
   >       run: |
   >         # Fix for TTY related issues when launching the Ammonite REPL in tests.
   >         export TERM=vt100 && script -qfc 'echo exit | amm -s' && rm typescript
   > ```
   > 
   > https://github.com/apache/spark/pull/40675/files#diff-48c0ee97c53013d18d6bbae44648f7fab9af2e0bf5b0dc1ca761e18ec5c478f2R250-R253
   
   @zhenlineo is there a reference for such a magic command? It would be better to briefly explain the mechanism or leave a link in the comments.




[GitHub] [spark] HyukjinKwon commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1174696470


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,128 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.application
+
+import java.io.{PipedInputStream, PipedOutputStream}
+import java.util.concurrent.{Executors, Semaphore, TimeUnit}
+
+import org.apache.commons.io.output.ByteArrayOutputStream
+
+import org.apache.spark.sql.connect.client.util.RemoteSparkSession
+
+class ReplE2ESuite extends RemoteSparkSession {
+
+  private val executorService = Executors.newSingleThreadExecutor()
+  private val TIMEOUT_SECONDS = 10
+
+  private var testSuiteOut: PipedOutputStream = _
+  private var ammoniteOut: ByteArrayOutputStream = _
+  private var errorStream: ByteArrayOutputStream = _
+  private var ammoniteIn: PipedInputStream = _
+  private val semaphore: Semaphore = new Semaphore(0)
+
+  private def getCleanString(out: ByteArrayOutputStream): String = {
+    // Remove ANSI colour codes
+    // Regex taken from https://stackoverflow.com/a/25189932
+    out.toString("UTF-8").replaceAll("\u001B\\[[\\d;]*[^\\d;]", "")
+  }
+
+  override def beforeAll(): Unit = {
+    super.beforeAll()
+    ammoniteOut = new ByteArrayOutputStream()
+    testSuiteOut = new PipedOutputStream()
+    // Connect the `testSuiteOut` and `ammoniteIn` pipes
+    ammoniteIn = new PipedInputStream(testSuiteOut)
+    errorStream = new ByteArrayOutputStream()
+
+    val args = Array("--port", serverPort.toString)
+    val task = new Runnable {
+      override def run(): Unit = {
+        ConnectRepl.doMain(
+          args = args,
+          semaphore = Some(semaphore),
+          inputStream = ammoniteIn,
+          outputStream = ammoniteOut,
+          errorStream = errorStream)
+      }
+    }
+
+    executorService.submit(task)
+  }
+
+  override def afterAll(): Unit = {
+    executorService.shutdownNow()
+    super.afterAll()
+  }
+
+  def runCommandsInShell(input: String): String = {
+    require(input.nonEmpty)
+    // Pad the input with a semaphore release so that we know when the execution of the provided
+    // input is complete.
+    val paddedInput = input + '\n' + "semaphore.release()\n"
+    testSuiteOut.write(paddedInput.getBytes)
+    testSuiteOut.flush()
+    if (!semaphore.tryAcquire(TIMEOUT_SECONDS, TimeUnit.SECONDS)) {
+      val failOut = getCleanString(ammoniteOut)
+      val errOut = getCleanString(errorStream)
+      val errorString =
+        s"""
+          |REPL Timed out while running command: $input
+          |Console output: $failOut
+          |Error output: $errOut
+          |""".stripMargin
+      throw new RuntimeException(errorString)
+    }
+    getCleanString(ammoniteOut)
+  }
+
+  def assertContains(message: String, output: String): Unit = {
+    val isContain = output.contains(message)
+    assert(isContain, "Ammonite output did not contain '" + message + "':\n" + output)
+  }
+
+  test("Simple query") {

Review Comment:
   This test consistently fails with JDK 17:
   
   ```
   [info] ReplE2ESuite:
   [info] - Simple query *** FAILED *** (10 seconds, 4 milliseconds)
   [info]   java.lang.RuntimeException: REPL Timed out while running command: 
   [info] spark.sql("select 1").collect()
   [info]       
   [info] Console output: 
   [info] Error output: Compiling (synthetic)/ammonite/predef/ArgsPredef.sc
   [info]   at org.apache.spark.sql.application.ReplE2ESuite.runCommandsInShell(ReplE2ESuite.scala:87)
   [info]   at org.apache.spark.sql.application.ReplE2ESuite.$anonfun$new$1(ReplE2ESuite.scala:102)
   [info]   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
   [info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
   [info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
   [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
   [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
   [info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
   [info]   at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
   [info]   at org.scalatest.TestSuite.withFixture(TestSuite.scala:196)
   [info]   at org.scalatest.TestSuite.withFixture$(TestSuite.scala:195)
   [info]   at org.scalatest.funsuite.AnyFunSuite.withFixture(AnyFunSuite.scala:1564)
   [info]   at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
   ```
   https://github.com/apache/spark/actions/runs/4780630672/jobs/8498505928#step:9:4647
   https://github.com/apache/spark/actions/runs/4774942961/jobs/8488946907
   https://github.com/apache/spark/actions/runs/4769162286/jobs/8479293802
   https://github.com/apache/spark/actions/runs/4759278349/jobs/8458399201
   https://github.com/apache/spark/actions/runs/4748319019/jobs/8434392414
   
   





[GitHub] [spark] LuciferYang commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1515645182

   > > Hmm... GA should merge the current pr to the latest master before testing, right?
   > 
   > Yes except the workflow file. For the changes in workflow, they have to manually update to the latest master branch.
   
   Got it ~




[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1174917132


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,128 @@
+  test("Simple query") {

Review Comment:
   Looking. The output looks a bit weird: the first test fails due to a REPL timeout, but the error logs indicate it is just setting up (albeit slowly). The first 'false' failure seems to cause a chain reaction of failures in the other tests, because they are all looking at the shell output earlier than they should, due to the semaphore permit left unacquired by the first test.
   
   An increase in the REPL shell timeout should fix this. Verifying.
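   One way such an increase could look (illustrative only; the constant name matches the suite above, but the value and the environment override are assumptions, not the actual patch):

   ```scala
   // Allow slower CI runners (e.g. JDK 17 jobs) more time before
   // declaring a REPL timeout; overridable for local debugging.
   private val TIMEOUT_SECONDS =
     sys.env.get("REPL_E2E_TIMEOUT_SECONDS").map(_.toInt).getOrElse(30)
   ```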





[GitHub] [spark] hvanhovell commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "hvanhovell (via GitHub)" <gi...@apache.org>.
hvanhovell commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1158710018


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########
@@ -488,6 +488,14 @@ class SparkSession private[sql] (
   @scala.annotation.varargs
   def addArtifacts(uri: URI*): Unit = client.addArtifacts(uri)
 
+  /**
+   * Register a [[ClassFinder]] for dynamically generated classes.
+   *
+   * @since 3.4.0
+   */
+  @Experimental

Review Comment:
   There is part of me that thinks we should be doing this as part of building the session, instead of adding it while running.





[GitHub] [spark] grundprinzip commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "grundprinzip (via GitHub)" <gi...@apache.org>.
grundprinzip commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1166738100


##########
.github/workflows/build_and_test.yml:
##########
@@ -247,7 +247,10 @@ jobs:
     # Run the tests.
     - name: Run tests
       env: ${{ fromJSON(inputs.envs) }}
+      shell: 'script -q -e -c "bash {0}"'
       run: |
+        # Fix for TTY related issues when laaunching Ammonite REPL in tests.

Review Comment:
   Typo





[GitHub] [spark] amaliujia commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "amaliujia (via GitHub)" <gi...@apache.org>.
amaliujia commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1169085994


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/connect/client/SparkConnectClient.scala:
##########
@@ -58,10 +58,13 @@ private[sql] class SparkConnectClient(
    * @return
    *   A [[proto.AnalyzePlanResponse]] from the Spark Connect server.
    */
-  def analyze(request: proto.AnalyzePlanRequest): proto.AnalyzePlanResponse =
+  def analyze(request: proto.AnalyzePlanRequest): proto.AnalyzePlanResponse = {
+    artifactManager.uploadAllClassFileArtifacts()

Review Comment:
   Does this method cache previously scanned artifacts? If not, maybe the overhead of calling analyze and execute becomes a bit large?
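   One way to keep that overhead small (a hypothetical sketch, NOT the actual `ArtifactManager` implementation) is to fingerprint each class file and re-upload only entries whose fingerprint changed since the last scan:

```python
import os

# Hypothetical dedup sketch: remember a cheap fingerprint per class file and
# skip files that have not changed since the previous scan. This is NOT the
# real ArtifactManager code, just an illustration of the caching idea.
def stat_fingerprint(path):
    """Fingerprint a file by (mtime_ns, size); cheap but good enough here."""
    st = os.stat(path)
    return (st.st_mtime_ns, st.st_size)

class ClassFileUploader:
    def __init__(self, fingerprint_fn=stat_fingerprint):
        self._fingerprint = fingerprint_fn
        self._seen = {}  # path -> last observed fingerprint

    def files_to_upload(self, paths):
        """Return only files that are new or changed since the last call."""
        changed = []
        for path in paths:
            fp = self._fingerprint(path)
            if self._seen.get(path) != fp:
                self._seen[path] = fp
                changed.append(path)
        return changed
```

   With such a cache, repeated analyze/execute calls in a long REPL session would only pay for files the REPL actually recompiled.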





[GitHub] [spark] HyukjinKwon commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1515644305

   > Hmm... GA should merge the current pr to the latest master before testing, right?
   
   Yes, except for the workflow file. For changes to the workflow, they have to manually update it to the latest master branch.




[GitHub] [spark] LuciferYang commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1515617336

   Hmm... GA should merge the current PR to the latest master before testing, right? 
   
   It's okay, let's wait and see if any new cases come up.
   
   




[GitHub] [spark] LuciferYang commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1517995794

   @vicennial I found `ReplE2ESuite` always fails in the Java 17 GA daily test:
   
   - https://github.com/apache/spark/actions/runs/4726264540/jobs/8385681548
   - https://github.com/apache/spark/actions/runs/4737365554/jobs/8410097712
   - https://github.com/apache/spark/actions/runs/4748319019/jobs/8434392414
   - https://github.com/apache/spark/actions/runs/4759278349/jobs/8458399201
   
   <img width="1307" alt="image" src="https://user-images.githubusercontent.com/1475305/233674106-5cf0c4cf-ed4f-4d75-be42-3b7c39dc2936.png">
   




[GitHub] [spark] hvanhovell commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "hvanhovell (via GitHub)" <gi...@apache.org>.
hvanhovell commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1158709117


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########
@@ -488,6 +488,14 @@ class SparkSession private[sql] (
   @scala.annotation.varargs
   def addArtifacts(uri: URI*): Unit = client.addArtifacts(uri)
 
+  /**
+   * Register a [[ClassFinder]] for dynamically generated classes.
+   *
+   * @since 3.4.0
+   */
+  @Experimental

Review Comment:
   Also mark @Private?





[GitHub] [spark] hvanhovell commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "hvanhovell (via GitHub)" <gi...@apache.org>.
hvanhovell commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1158721057


##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/SimpleSparkConnectService.scala:
##########
@@ -38,7 +38,7 @@ private[sql] object SimpleSparkConnectService {
   private val stopCommand = "q"
 
   def main(args: Array[String]): Unit = {
-    val conf = new SparkConf()
+    val conf = new SparkConf().set("connect.test", "true")

Review Comment:
   Can't we just configure the plugin here? Instead of adding a new conf?





[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1159372877


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########


Review Comment:
   The `ReplE2ESuite` extends this suite, and the port value needs to be passed through to Ammonite





[GitHub] [spark] zhenlineo commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "zhenlineo (via GitHub)" <gi...@apache.org>.
zhenlineo commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1163214597


##########
resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile:
##########
@@ -60,3 +60,4 @@ ENTRYPOINT [ "/opt/entrypoint.sh" ]
 
 # Specify the User that the actual main process will run as
 USER ${spark_uid}
+RUN export TERM=vt100 && script -qfc 'echo exit | amm -s' && rm typescript

Review Comment:
   Maybe this file instead? https://github.com/apache/spark/blob/master/dev/infra/Dockerfile
   
   I have a test build to save some time for you -> https://github.com/apache/spark/pull/40745



##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,123 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.application
+
+import java.io.{PipedInputStream, PipedOutputStream}
+import java.util.concurrent.{Executors, Semaphore, TimeUnit}
+
+import org.apache.commons.io.output.ByteArrayOutputStream
+
+import org.apache.spark.sql.connect.client.util.RemoteSparkSession
+
+class ReplE2ESuite extends RemoteSparkSession {

Review Comment:
   Can you add some doc about whether there is anything extra to install to run this test? Two tests failed on my local machine with the command:
   `sbt "testOnly org.apache.spark.sql.application.ReplE2ESuite"`
   
   Did I miss something?



##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/SimpleSparkConnectService.scala:
##########
@@ -38,7 +38,7 @@ private[sql] object SimpleSparkConnectService {
   private val stopCommand = "q"
 
   def main(args: Array[String]): Unit = {
-    val conf = new SparkConf()
+    val conf = new SparkConf().set("connect.test", "true")

Review Comment:
   Did you drop the line for `server.start` and `server.stop` in this file? 
   
   When you set the plugin, a connect server will be started and stopped within the Spark context, so there is no need to start an extra one in this tiny script.





[GitHub] [spark] pan3793 commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "pan3793 (via GitHub)" <gi...@apache.org>.
pan3793 commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1168716537


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########
@@ -488,6 +488,14 @@ class SparkSession private[sql] (
   @scala.annotation.varargs
   def addArtifacts(uri: URI*): Unit = client.addArtifacts(uri)
 
+  /**
+   * Register a [[ClassFinder]] for dynamically generated classes.
+   *
+   * @since 3.4.0

Review Comment:
   should be 3.5.0





[GitHub] [spark] vicennial commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1513102087

   @LuciferYang It looks related to the TTY issues (https://github.com/com-lihaoyi/Ammonite/issues/276) we were hitting in the CI pipelines earlier (`2023-04-18T03:29:24.5685770Z sh: 1: cannot open /dev/tty: No such device or address` in the logs is the indicator). 
   
   [These](https://github.com/apache/spark/pull/40675/files#diff-48c0ee97c53013d18d6bbae44648f7fab9af2e0bf5b0dc1ca761e18ec5c478f2R250-R253) changes (taken from [here](https://github.com/com-lihaoyi/Ammonite/issues/276#issuecomment-439273906)) helped us mitigate the issue in the CI pipelines, but perhaps the GA test pipeline uses a different config? Any idea about the pipeline config, @HyukjinKwon?




[GitHub] [spark] hvanhovell commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "hvanhovell (via GitHub)" <gi...@apache.org>.
hvanhovell commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1511331257

   Merging




[GitHub] [spark] HyukjinKwon commented on pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on PR #40675:
URL: https://github.com/apache/spark/pull/40675#issuecomment-1515586188

   Ah, I think it just happened because of an unsynced fork. It should be fixed once their fork is synced to the latest master branch.




[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1164191842


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,123 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.application
+
+import java.io.{PipedInputStream, PipedOutputStream}
+import java.util.concurrent.{Executors, Semaphore, TimeUnit}
+
+import org.apache.commons.io.output.ByteArrayOutputStream
+
+import org.apache.spark.sql.connect.client.util.RemoteSparkSession
+
+class ReplE2ESuite extends RemoteSparkSession {

Review Comment:
   That's odd. What were the test failures? 
   Since it is an E2E test, both the connect and server jars would need to be packaged before the tests. Could you try running the test after doing a `package`? 





[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1174917132


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,128 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.application
+
+import java.io.{PipedInputStream, PipedOutputStream}
+import java.util.concurrent.{Executors, Semaphore, TimeUnit}
+
+import org.apache.commons.io.output.ByteArrayOutputStream
+
+import org.apache.spark.sql.connect.client.util.RemoteSparkSession
+
+class ReplE2ESuite extends RemoteSparkSession {
+
+  private val executorService = Executors.newSingleThreadExecutor()
+  private val TIMEOUT_SECONDS = 10
+
+  private var testSuiteOut: PipedOutputStream = _
+  private var ammoniteOut: ByteArrayOutputStream = _
+  private var errorStream: ByteArrayOutputStream = _
+  private var ammoniteIn: PipedInputStream = _
+  private val semaphore: Semaphore = new Semaphore(0)
+
+  private def getCleanString(out: ByteArrayOutputStream): String = {
+    // Remove ANSI colour codes
+    // Regex taken from https://stackoverflow.com/a/25189932
+    out.toString("UTF-8").replaceAll("\u001B\\[[\\d;]*[^\\d;]", "")
+  }
+
+  override def beforeAll(): Unit = {
+    super.beforeAll()
+    ammoniteOut = new ByteArrayOutputStream()
+    testSuiteOut = new PipedOutputStream()
+    // Connect the `testSuiteOut` and `ammoniteIn` pipes
+    ammoniteIn = new PipedInputStream(testSuiteOut)
+    errorStream = new ByteArrayOutputStream()
+
+    val args = Array("--port", serverPort.toString)
+    val task = new Runnable {
+      override def run(): Unit = {
+        ConnectRepl.doMain(
+          args = args,
+          semaphore = Some(semaphore),
+          inputStream = ammoniteIn,
+          outputStream = ammoniteOut,
+          errorStream = errorStream)
+      }
+    }
+
+    executorService.submit(task)
+  }
+
+  override def afterAll(): Unit = {
+    executorService.shutdownNow()
+    super.afterAll()
+  }
+
+  def runCommandsInShell(input: String): String = {
+    require(input.nonEmpty)
+    // Pad the input with a semaphore release so that we know when the execution of the provided
+    // input is complete.
+    val paddedInput = input + '\n' + "semaphore.release()\n"
+    testSuiteOut.write(paddedInput.getBytes)
+    testSuiteOut.flush()
+    if (!semaphore.tryAcquire(TIMEOUT_SECONDS, TimeUnit.SECONDS)) {
+      val failOut = getCleanString(ammoniteOut)
+      val errOut = getCleanString(errorStream)
+      val errorString =
+        s"""
+          |REPL Timed out while running command: $input
+          |Console output: $failOut
+          |Error output: $errOut
+          |""".stripMargin
+      throw new RuntimeException(errorString)
+    }
+    getCleanString(ammoniteOut)
+  }
+
+  def assertContains(message: String, output: String): Unit = {
+    val isContain = output.contains(message)
+    assert(isContain, "Ammonite output did not contain '" + message + "':\n" + output)
+  }
+
+  test("Simple query") {

Review Comment:
   Looking. The output looks a bit weird - the first test fails due to a REPL timeout, but the error logs indicate it is just setting up (albeit slowly). The first 'false' failure seems to cause a chain reaction of failures in the other tests, because they all read the shell output earlier than they should due to the semaphore permit left unacquired by the first test.
   
   An increase in the REPL shell timeout should fix this (+ draining permits after every test). Verifying
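   The "draining permits" half of that fix can be illustrated with a small Python sketch (names hypothetical): a command that timed out may still release its permit later, and that stale permit would let the next test's `tryAcquire` return before its own command has finished, so leftover permits are cleared between tests.

```python
import threading

# Sketch of draining stale semaphore permits between tests: a timed-out
# command can release its permit after the test has already failed, and that
# leftover permit would make the next tryAcquire succeed too early.
def drain_permits(semaphore):
    """Consume every currently available permit without blocking."""
    drained = 0
    while semaphore.acquire(blocking=False):
        drained += 1
    return drained
```

   In the Scala suite, `java.util.concurrent.Semaphore.drainPermits()` provides this operation directly.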





[GitHub] [spark] HyukjinKwon commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1178203553


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,128 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.application
+
+import java.io.{PipedInputStream, PipedOutputStream}
+import java.util.concurrent.{Executors, Semaphore, TimeUnit}
+
+import org.apache.commons.io.output.ByteArrayOutputStream
+
+import org.apache.spark.sql.connect.client.util.RemoteSparkSession
+
+class ReplE2ESuite extends RemoteSparkSession {
+
+  private val executorService = Executors.newSingleThreadExecutor()
+  private val TIMEOUT_SECONDS = 10
+
+  private var testSuiteOut: PipedOutputStream = _
+  private var ammoniteOut: ByteArrayOutputStream = _
+  private var errorStream: ByteArrayOutputStream = _
+  private var ammoniteIn: PipedInputStream = _
+  private val semaphore: Semaphore = new Semaphore(0)
+
+  private def getCleanString(out: ByteArrayOutputStream): String = {
+    // Remove ANSI colour codes
+    // Regex taken from https://stackoverflow.com/a/25189932
+    out.toString("UTF-8").replaceAll("\u001B\\[[\\d;]*[^\\d;]", "")
+  }
+
+  override def beforeAll(): Unit = {
+    super.beforeAll()
+    ammoniteOut = new ByteArrayOutputStream()
+    testSuiteOut = new PipedOutputStream()
+    // Connect the `testSuiteOut` and `ammoniteIn` pipes
+    ammoniteIn = new PipedInputStream(testSuiteOut)
+    errorStream = new ByteArrayOutputStream()
+
+    val args = Array("--port", serverPort.toString)
+    val task = new Runnable {
+      override def run(): Unit = {
+        ConnectRepl.doMain(
+          args = args,
+          semaphore = Some(semaphore),
+          inputStream = ammoniteIn,
+          outputStream = ammoniteOut,
+          errorStream = errorStream)
+      }
+    }
+
+    executorService.submit(task)
+  }
+
+  override def afterAll(): Unit = {
+    executorService.shutdownNow()
+    super.afterAll()
+  }
+
+  def runCommandsInShell(input: String): String = {
+    require(input.nonEmpty)
+    // Pad the input with a semaphore release so that we know when the execution of the provided
+    // input is complete.
+    val paddedInput = input + '\n' + "semaphore.release()\n"
+    testSuiteOut.write(paddedInput.getBytes)
+    testSuiteOut.flush()
+    if (!semaphore.tryAcquire(TIMEOUT_SECONDS, TimeUnit.SECONDS)) {
+      val failOut = getCleanString(ammoniteOut)
+      val errOut = getCleanString(errorStream)
+      val errorString =
+        s"""
+          |REPL Timed out while running command: $input
+          |Console output: $failOut
+          |Error output: $errOut
+          |""".stripMargin
+      throw new RuntimeException(errorString)
+    }
+    getCleanString(ammoniteOut)
+  }
+
+  def assertContains(message: String, output: String): Unit = {
+    val isContain = output.contains(message)
+    assert(isContain, "Ammonite output did not contain '" + message + "':\n" + output)
+  }
+
+  test("Simple query") {

Review Comment:
   Thx!





[GitHub] [spark] vicennial commented on a diff in pull request #40675: [SPARK-42657][CONNECT] Support to find and transfer client-side REPL classfiles to server as artifacts

Posted by "vicennial (via GitHub)" <gi...@apache.org>.
vicennial commented on code in PR #40675:
URL: https://github.com/apache/spark/pull/40675#discussion_r1176757185


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/application/ReplE2ESuite.scala:
##########
@@ -0,0 +1,128 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.application
+
+import java.io.{PipedInputStream, PipedOutputStream}
+import java.util.concurrent.{Executors, Semaphore, TimeUnit}
+
+import org.apache.commons.io.output.ByteArrayOutputStream
+
+import org.apache.spark.sql.connect.client.util.RemoteSparkSession
+
+class ReplE2ESuite extends RemoteSparkSession {
+
+  private val executorService = Executors.newSingleThreadExecutor()
+  private val TIMEOUT_SECONDS = 10
+
+  private var testSuiteOut: PipedOutputStream = _
+  private var ammoniteOut: ByteArrayOutputStream = _
+  private var errorStream: ByteArrayOutputStream = _
+  private var ammoniteIn: PipedInputStream = _
+  private val semaphore: Semaphore = new Semaphore(0)
+
+  private def getCleanString(out: ByteArrayOutputStream): String = {
+    // Remove ANSI colour codes
+    // Regex taken from https://stackoverflow.com/a/25189932
+    out.toString("UTF-8").replaceAll("\u001B\\[[\\d;]*[^\\d;]", "")
+  }
+
+  override def beforeAll(): Unit = {
+    super.beforeAll()
+    ammoniteOut = new ByteArrayOutputStream()
+    testSuiteOut = new PipedOutputStream()
+    // Connect the `testSuiteOut` and `ammoniteIn` pipes
+    ammoniteIn = new PipedInputStream(testSuiteOut)
+    errorStream = new ByteArrayOutputStream()
+
+    val args = Array("--port", serverPort.toString)
+    val task = new Runnable {
+      override def run(): Unit = {
+        ConnectRepl.doMain(
+          args = args,
+          semaphore = Some(semaphore),
+          inputStream = ammoniteIn,
+          outputStream = ammoniteOut,
+          errorStream = errorStream)
+      }
+    }
+
+    executorService.submit(task)
+  }
+
+  override def afterAll(): Unit = {
+    executorService.shutdownNow()
+    super.afterAll()
+  }
+
+  def runCommandsInShell(input: String): String = {
+    require(input.nonEmpty)
+    // Pad the input with a semaphore release so that we know when the execution of the provided
+    // input is complete.
+    val paddedInput = input + '\n' + "semaphore.release()\n"
+    testSuiteOut.write(paddedInput.getBytes)
+    testSuiteOut.flush()
+    if (!semaphore.tryAcquire(TIMEOUT_SECONDS, TimeUnit.SECONDS)) {
+      val failOut = getCleanString(ammoniteOut)
+      val errOut = getCleanString(errorStream)
+      val errorString =
+        s"""
+          |REPL Timed out while running command: $input
+          |Console output: $failOut
+          |Error output: $errOut
+          |""".stripMargin
+      throw new RuntimeException(errorString)
+    }
+    getCleanString(ammoniteOut)
+  }
+
+  def assertContains(message: String, output: String): Unit = {
+    val isContain = output.contains(message)
+    assert(isContain, "Ammonite output did not contain '" + message + "':\n" + output)
+  }
+
+  test("Simple query") {

Review Comment:
   @HyukjinKwon Verified that the fix works locally, ptal at PR: https://github.com/apache/spark/pull/40948
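For readers following along, the core synchronization trick in the quoted `ReplE2ESuite` is: the test writes commands into a `PipedOutputStream` wired to the REPL's stdin, appends a sentinel `semaphore.release()` command, and then blocks on `Semaphore.tryAcquire` with a timeout to learn when the REPL has finished executing the input. Below is a minimal, self-contained Java sketch of that handshake (not code from the PR); the `ReplSyncSketch` class and the inline stub that stands in for the Ammonite REPL are illustrative assumptions, using the same JDK APIs (`PipedInputStream`, `PipedOutputStream`, `java.util.concurrent.Semaphore`) as the Scala suite.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class ReplSyncSketch {
    public static void main(String[] args) throws Exception {
        Semaphore semaphore = new Semaphore(0);
        // Connect the test-side output pipe to the "REPL"-side input pipe,
        // mirroring testSuiteOut -> ammoniteIn in the suite.
        PipedOutputStream testOut = new PipedOutputStream();
        PipedInputStream replIn = new PipedInputStream(testOut);

        ExecutorService pool = Executors.newSingleThreadExecutor();
        // Stand-in for the REPL thread: consume lines and release the
        // semaphore when the sentinel command arrives.
        pool.submit(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(replIn))) {
                String line;
                while ((line = r.readLine()) != null) {
                    if (line.equals("semaphore.release()")) {
                        semaphore.release();
                    }
                }
            } catch (IOException ignored) {
                // Pipe closed or thread interrupted during shutdown.
            }
            return null;
        });

        // Pad the input with the sentinel, as runCommandsInShell does.
        String paddedInput = "println(1 + 1)\n" + "semaphore.release()\n";
        testOut.write(paddedInput.getBytes("UTF-8"));
        testOut.flush();

        // Block until the stub "REPL" signals completion, or time out.
        boolean done = semaphore.tryAcquire(10, TimeUnit.SECONDS);
        System.out.println("completed=" + done);

        testOut.close();
        pool.shutdownNow();
    }
}
```

The timeout branch in the real suite additionally drains and prints the captured console and error streams before failing, which is what makes REPL hangs debuggable from CI logs.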


