Posted to reviews@spark.apache.org by "zhengruifeng (via GitHub)" <gi...@apache.org> on 2023/11/09 18:56:14 UTC

[PR] [SPARK-45859][ML] Make UDF objects in ml.functions lazy [spark]

zhengruifeng opened a new pull request, #43739:
URL: https://github.com/apache/spark/pull/43739

   
   ### What changes were proposed in this pull request?
   Since the JVM runs a class's static initializer only once, if the initialization of `functions$` fails, every subsequent use reports `java.lang.NoClassDefFoundError: Could not initialize class`:
   ```
   23/11/01 23:06:21 WARN TaskSetManager: Lost task 136.0 in stage 9565.0 (TID 4557384) (10.4.35.209 executor 16): TaskKilled (Stage cancelled: Job aborted due to stage failure: Task 2 in stage 9565.0 failed 4 times, most recent failure: Lost task 2.3 in stage 9565.0 (TID 4558369) (10.4.56.6 executor 71): java.io.IOException: unexpected exception type
   	at java.io.ObjectStreamClass.throwMiscException(ObjectStreamClass.java:1750)
   	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1280)
   	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2222)
   	…
   	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:900)
   	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
   	at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
   	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:795)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:750)
   Caused by: java.lang.reflect.InvocationTargetException
   	at sun.reflect.GeneratedMethodAccessor520.invoke(Unknown Source)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at java.lang.invoke.SerializedLambda.readResolve(SerializedLambda.java:230)
   	at sun.reflect.GeneratedMethodAccessor224.invoke(Unknown Source)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1274)
   	... 388 more
   Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.ml.functions$
   	... 396 more
   ```
   
   This PR makes the UDF objects in `functions` lazy to avoid this issue, because the initialization code of a `lazy val` runs in its accessor rather than in the class's static initializer.
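   The failure mode and the fix can be sketched outside Spark (hypothetical names; this is not the actual `ml.functions` code): a plain `val` in an object is initialized in that object's static initializer, so a single failure poisons the class for the lifetime of the JVM, while a `lazy val` initializer runs in its accessor and is simply retried on the next access.

   ```scala
   object LazyValDemo {
     // Simulates a transiently broken environment (e.g. a broken class loader).
     @volatile var loaderBroken = true

     object EagerFns {
       // Evaluated inside EagerFns$'s static initializer: one failure is fatal.
       val toUpper: String => String = {
         if (loaderBroken) sys.error("simulated broken class loader")
         s => s.toUpperCase
       }
     }

     object LazyFns {
       // Evaluated on first access, outside the static initializer: retried on failure.
       lazy val toUpper: String => String = {
         if (loaderBroken) sys.error("simulated broken class loader")
         s => s.toUpperCase
       }
     }

     def errName(body: => Any): String =
       try { body; "ok" } catch { case t: Throwable => t.getClass.getSimpleName }

     def main(args: Array[String]): Unit = {
       println(errName(EagerFns.toUpper("a"))) // ExceptionInInitializerError
       println(errName(LazyFns.toUpper("a")))  // RuntimeException
       loaderBroken = false                    // "repair" the environment
       println(errName(EagerFns.toUpper("a"))) // NoClassDefFoundError: class stays poisoned
       println(errName(LazyFns.toUpper("a")))  // ok: the lazy val retries and succeeds
     }
   }
   ```

   This mirrors the stack trace above: once `functions$`'s static initializer has failed on an executor, every later deserialization of a UDF referencing it hits `NoClassDefFoundError`, whereas a `lazy val` would recover once the class loader is healthy again.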
   
   
   ### Why are the changes needed?
   To fix an intermittent bug.
   
   
   ### Does this PR introduce _any_ user-facing change?
   no
   
   
   ### How was this patch tested?
   manually checked
   
   
   ### Was this patch authored or co-authored using generative AI tooling?
   no
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


Re: [PR] [SPARK-45859][ML] Make UDF objects in ml.functions lazy [spark]

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on PR #43739:
URL: https://github.com/apache/spark/pull/43739#issuecomment-1807385986

   @WeichenXu123 would you mind taking another look?




Re: [PR] [SPARK-45859][ML] Make UDF objects in ml.functions lazy [spark]

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on code in PR #43739:
URL: https://github.com/apache/spark/pull/43739#discussion_r1389015793


##########
mllib/src/test/scala/org/apache/spark/ml/FunctionsSuite.scala:
##########
@@ -101,4 +102,46 @@ class FunctionsSuite extends MLTest {
     val resultVec3 = df3.select(array_to_vector(col("c1"))).collect()(0)(0).asInstanceOf[Vector]
     assert(resultVec3 === Vectors.dense(Array(1.0, 2.0)))
   }
+
+  test("SPARK-45859: 'functions$' should not be affected by a broken class loader") {
+    quietly {
+      // Only one SparkContext should be running in this JVM (see SPARK-2243)
+      sc.stop()
+
+      val conf = new SparkConf()
+        .setAppName("FunctionsSuite")
+        .setMaster("local-cluster[1,1,1024]")
+      sc = new SparkContext(conf)
+      // Make `functions$` be loaded by a broken class loader
+      intercept[SparkException] {
+        sc.parallelize(1 to 1).foreach { _ =>
+          val originalClassLoader = Thread.currentThread.getContextClassLoader
+          try {
+            Thread.currentThread.setContextClassLoader(new BrokenClassLoader)
+            vector_to_array(col("vector"))
+            array_to_vector(col("array"))
+          } finally {
+            Thread.currentThread.setContextClassLoader(originalClassLoader)
+          }
+        }
+      }
+
+      // We should be able to use `functions$` even it was loaded by a broken class loader
+      sc.parallelize(1 to 1).foreach { _ =>
+        vector_to_array(col("vector"))
+        array_to_vector(col("array"))
+      }
+
+      // this UT should be the last one in this test suite, since it uses
+      // a different `sc` from the standard one.
+      // stop it here in case new UTs are added after this one.

Review Comment:
   I think it's better to create a separate suite, and document this on the top to avoid such mistakes.





Re: [PR] [SPARK-45859][ML] Make UDF objects in ml.functions lazy [spark]

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon closed pull request #43739: [SPARK-45859][ML] Make UDF objects in ml.functions lazy
URL: https://github.com/apache/spark/pull/43739




Re: [PR] [SPARK-45859][ML] Make UDF objects in ml.functions lazy [spark]

Posted by "zsxwing (via GitHub)" <gi...@apache.org>.
zsxwing commented on PR #43739:
URL: https://github.com/apache/spark/pull/43739#issuecomment-1804413748

   Could you add a unit test for this change?




Re: [PR] [SPARK-45859][ML] Make UDF objects in ml.functions lazy [spark]

Posted by "WeichenXu123 (via GitHub)" <gi...@apache.org>.
WeichenXu123 commented on code in PR #43739:
URL: https://github.com/apache/spark/pull/43739#discussion_r1388617978


##########
mllib/src/test/scala/org/apache/spark/ml/FunctionsSuite.scala:
##########
@@ -101,4 +102,41 @@ class FunctionsSuite extends MLTest {
     val resultVec3 = df3.select(array_to_vector(col("c1"))).collect()(0)(0).asInstanceOf[Vector]
     assert(resultVec3 === Vectors.dense(Array(1.0, 2.0)))
   }
+
+  test("SPARK-45859: 'functions$' should not be affected by a broken class loader") {
+    quietly {
+      // Only one SparkContext should be running in this JVM (see SPARK-2243)
+      sc.stop()

Review Comment:
   shouldn't we add this after each test run ?





Re: [PR] [SPARK-45859][ML] Make UDF objects in ml.functions lazy [spark]

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on PR #43739:
URL: https://github.com/apache/spark/pull/43739#issuecomment-1807703285

   Merged to master.




Re: [PR] [SPARK-45859][ML] Make UDF objects in ml.functions lazy [spark]

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on code in PR #43739:
URL: https://github.com/apache/spark/pull/43739#discussion_r1389712978


##########
mllib/src/test/scala/org/apache/spark/ml/FunctionsSuite.scala:
##########
@@ -101,4 +102,46 @@ class FunctionsSuite extends MLTest {
     val resultVec3 = df3.select(array_to_vector(col("c1"))).collect()(0)(0).asInstanceOf[Vector]
     assert(resultVec3 === Vectors.dense(Array(1.0, 2.0)))
   }
+
+  test("SPARK-45859: 'functions$' should not be affected by a broken class loader") {
+    quietly {
+      // Only one SparkContext should be running in this JVM (see SPARK-2243)
+      sc.stop()
+
+      val conf = new SparkConf()
+        .setAppName("FunctionsSuite")
+        .setMaster("local-cluster[1,1,1024]")
+      sc = new SparkContext(conf)
+      // Make `functions$` be loaded by a broken class loader
+      intercept[SparkException] {
+        sc.parallelize(1 to 1).foreach { _ =>
+          val originalClassLoader = Thread.currentThread.getContextClassLoader
+          try {
+            Thread.currentThread.setContextClassLoader(new BrokenClassLoader)
+            vector_to_array(col("vector"))
+            array_to_vector(col("array"))
+          } finally {
+            Thread.currentThread.setContextClassLoader(originalClassLoader)
+          }
+        }
+      }
+
+      // We should be able to use `functions$` even it was loaded by a broken class loader
+      sc.parallelize(1 to 1).foreach { _ =>
+        vector_to_array(col("vector"))
+        array_to_vector(col("array"))
+      }
+
+      // this UT should be the last one in this test suite, since it uses
+      // a different `sc` from the standard one.
+      // stop it here in case new UTs are added after this one.

Review Comment:
   sounds good, let me move it to a separate file





Re: [PR] [SPARK-45859][ML] Make UDF objects in ml.functions lazy [spark]

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on code in PR #43739:
URL: https://github.com/apache/spark/pull/43739#discussion_r1389946041


##########
mllib/src/test/scala/org/apache/spark/ml/FunctionsSuite.scala:
##########
@@ -101,4 +102,46 @@ class FunctionsSuite extends MLTest {
     val resultVec3 = df3.select(array_to_vector(col("c1"))).collect()(0)(0).asInstanceOf[Vector]
     assert(resultVec3 === Vectors.dense(Array(1.0, 2.0)))
   }
+
+  test("SPARK-45859: 'functions$' should not be affected by a broken class loader") {
+    quietly {
+      // Only one SparkContext should be running in this JVM (see SPARK-2243)
+      sc.stop()
+
+      val conf = new SparkConf()
+        .setAppName("FunctionsSuite")
+        .setMaster("local-cluster[1,1,1024]")
+      sc = new SparkContext(conf)
+      // Make `functions$` be loaded by a broken class loader
+      intercept[SparkException] {
+        sc.parallelize(1 to 1).foreach { _ =>
+          val originalClassLoader = Thread.currentThread.getContextClassLoader
+          try {
+            Thread.currentThread.setContextClassLoader(new BrokenClassLoader)
+            vector_to_array(col("vector"))
+            array_to_vector(col("array"))
+          } finally {
+            Thread.currentThread.setContextClassLoader(originalClassLoader)
+          }
+        }
+      }
+
+      // We should be able to use `functions$` even it was loaded by a broken class loader
+      sc.parallelize(1 to 1).foreach { _ =>
+        vector_to_array(col("vector"))
+        array_to_vector(col("array"))
+      }
+
+      // this UT should be the last one in this test suite, since it uses
+      // a different `sc` from the standard one.
+      // stop it here in case new UTs are added after this one.

Review Comment:
   done





Re: [PR] [SPARK-45859][ML] Make UDF objects in ml.functions lazy [spark]

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on code in PR #43739:
URL: https://github.com/apache/spark/pull/43739#discussion_r1388656184


##########
mllib/src/test/scala/org/apache/spark/ml/FunctionsSuite.scala:
##########
@@ -101,4 +102,41 @@ class FunctionsSuite extends MLTest {
     val resultVec3 = df3.select(array_to_vector(col("c1"))).collect()(0)(0).asInstanceOf[Vector]
     assert(resultVec3 === Vectors.dense(Array(1.0, 2.0)))
   }
+
+  test("SPARK-45859: 'functions$' should not be affected by a broken class loader") {
+    quietly {
+      // Only one SparkContext should be running in this JVM (see SPARK-2243)
+      sc.stop()

Review Comment:
   good point, we'd better stop `sc` in this UT, in case contributors add new UTs after this one and they run against an unexpected `sc`


