Posted to commits@spark.apache.org by gu...@apache.org on 2023/03/02 00:16:35 UTC
[spark] branch branch-3.4 updated: [SPARK-42632][CONNECT] Fix scala paths in integration tests
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
new e194260c368 [SPARK-42632][CONNECT] Fix scala paths in integration tests
e194260c368 is described below
commit e194260c368e4e84fe468de2be4fc5a9ec2e850d
Author: Herman van Hovell <he...@databricks.com>
AuthorDate: Thu Mar 2 09:16:10 2023 +0900
[SPARK-42632][CONNECT] Fix scala paths in integration tests
### What changes were proposed in this pull request?
Use the current Scala version to determine which jar to load in the integration tests.
### Why are the changes needed?
The jar resolution in the connect client tests can resolve the jar for the wrong scala version if you are working with multiple scala versions.
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
The change itself modifies test infrastructure, so it is exercised by the existing integration tests.
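The version-based directory resolution the diff below introduces can be sketched as a standalone snippet. This is a minimal illustration, not the patched file itself; the object and method names here are hypothetical, while `scala.util.Properties.versionNumberString` is the real standard-library accessor the patch uses.

```scala
import scala.util.Properties.versionNumberString

// Illustrative sketch: derive the SBT output directory name (e.g. "scala-2.12")
// from a full Scala version string such as "2.12.17".
object ScalaDirSketch {
  def scalaDir(version: String = versionNumberString): String = {
    // Keep only the major.minor prefix; fall back to the raw string
    // if the version does not split into at least two components.
    val majorMinor = version.split('.') match {
      case Array(major, minor, _*) => s"$major.$minor"
      case _                       => version
    }
    s"scala-$majorMinor"
  }

  def main(args: Array[String]): Unit = {
    println(scalaDir("2.12.17")) // scala-2.12
    println(scalaDir("2.13.8"))  // scala-2.13
  }
}
```

Matching on the exact `scala-<major>.<minor>` directory (rather than any name starting with `scala-`) is what prevents the test from picking up a jar built for a different Scala version when several `scala-*` target directories coexist.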
Closes #40235 from hvanhovell/SPARK-42632.
Authored-by: Herman van Hovell <he...@databricks.com>
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
(cherry picked from commit 1a457f9ed14810667b611155b586ebda5a95fece)
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
.../spark/sql/connect/client/util/IntegrationTestUtils.scala | 12 +++++++++++-
1 file changed, 11 insertions(+), 1 deletion(-)
diff --git a/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/IntegrationTestUtils.scala b/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/IntegrationTestUtils.scala
index 6c465c83b08..f27ea614a7e 100644
--- a/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/IntegrationTestUtils.scala
+++ b/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/IntegrationTestUtils.scala
@@ -18,6 +18,8 @@ package org.apache.spark.sql.connect.client.util
import java.io.File
+import scala.util.Properties.versionNumberString
+
import org.scalatest.Assertions.fail
object IntegrationTestUtils {
@@ -25,6 +27,14 @@ object IntegrationTestUtils {
// System properties used for testing and debugging
private val DEBUG_SC_JVM_CLIENT = "spark.debug.sc.jvm.client"
+ private[sql] lazy val scalaDir = {
+ val version = versionNumberString.split('.') match {
+ case Array(major, minor, _*) => major + "." + minor
+ case _ => versionNumberString
+ }
+ "scala-" + version
+ }
+
private[sql] lazy val sparkHome: String = {
if (!(sys.props.contains("spark.test.home") || sys.env.contains("SPARK_HOME"))) {
fail("spark.test.home or SPARK_HOME is not set.")
@@ -57,7 +67,7 @@ object IntegrationTestUtils {
"and the env variable `SPARK_HOME` is set correctly.")
val jars = recursiveListFiles(targetDir).filter { f =>
// SBT jar
- (f.getParentFile.getName.startsWith("scala-") &&
+ (f.getParentFile.getName == scalaDir &&
f.getName.startsWith(sbtName) && f.getName.endsWith(".jar")) ||
// Maven Jar
(f.getParent.endsWith("target") &&
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org