Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2022/02/10 03:49:00 UTC

[jira] [Commented] (SPARK-38165) private classes fail at runtime in scala 2.12.13+

    [ https://issues.apache.org/jira/browse/SPARK-38165?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17489945#comment-17489945 ] 

Hyukjin Kwon commented on SPARK-38165:
--------------------------------------

Interesting. Spark already uses Scala 2.12.15 even in Spark 3.2.1.

> private classes fail at runtime in scala 2.12.13+
> -------------------------------------------------
>
>                 Key: SPARK-38165
>                 URL: https://issues.apache.org/jira/browse/SPARK-38165
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.1
>         Environment: Tested using JVM 8 and 11 on Scala versions 2.12.12 (works), and 2.12.13 to 2.12.15 and 2.13.7 to 2.13.8 (fail)
>            Reporter: Johnny Everson
>            Priority: Major
>
> h2. reproduction steps
> {code:java}
> git clone git@github.com:everson/spark-codegen-bug.git
> sbt +test
> {code}
> h2. problem
> Starting with Scala 2.12.13, Spark code (tried with 3.1.x and 3.2.x versions) that refers to case class members fails at runtime.
> See the discussion on [https://github.com/scala/bug/issues/12533] for the exact internal details from the Scala contributors; the gist is that starting with Scala 2.12.13, inner class visibility rules changed via https://github.com/scala/scala/pull/9131, and Spark's CodeGen appears to assume those classes are public.
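> A minimal sketch of the shape that triggers this (hypothetical names, not taken from the linked repo): a case class that is private to its enclosing object, used as a Dataset element type:
> {code:java}
> object Pipelines {
>   // Private to the enclosing object. Since Scala 2.12.13 the emitted
>   // class file keeps this access modifier, so the Janino-compiled
>   // projection class generated by Spark can no longer reach its members.
>   private case class Record(id: Long, name: String)
>
>   def run(spark: org.apache.spark.sql.SparkSession): Unit = {
>     import spark.implicits._
>     val ds = Seq(Record(1L, "a")).toDS()
>     // Fails at runtime on 2.12.13+ with the CompileException shown below
>     ds.map(_.id).collect()
>   }
> }
> {code}
> Widening the case class to public visibility (or moving it to the top level) avoids the failure.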
> In a complex project, the error looks like:
> {code:java}
> [error]    Success(SparkFailures(NonEmpty[Unknown(org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 2.0 failed 1 times, most recent failure: Lost task 1.0 in stage 2.0 (TID 3) (192.168.0.80 executor driver): java.util.concurrent.ExecutionException: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 63, Column 8: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 63, Column 8: Private member cannot be accessed from type "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificSafeProjection".
> [error]    	at org.sparkproject.guava.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:306)
> [error]    	at org.sparkproject.guava.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:293)
> [error]    	at org.sparkproject.guava.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
> [error]    	at org.sparkproject.guava.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:135)
> [error]    	at org.sparkproject.guava.cache.LocalCache$LoadingValueReference.waitForValue(LocalCache.java:3620)
> [error]    	at org.sparkproject.guava.cache.LocalCache$Segment.waitForLoadingValue(LocalCache.java:2362)
> [error]    	at org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2349)
> [error]    	at org.sparkproject.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
> [error]    	at org.sparkproject.guava.cache.LocalCache.get(LocalCache.java:4000)
> [error]    	at org.sparkproject.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
> [error]    	at org.sparkproject.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1351)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:205)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:39)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1277)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1274)
> [error]    	at org.apache.spark.sql.execution.ObjectOperator$.deserializeRowToObject(objects.scala:147)
> [error]    	at org.apache.spark.sql.execution.AppendColumnsExec.$anonfun$doExecute$12(objects.scala:326)
> [error]    	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:898)
> [error]    	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:898)
> [error]    	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> [error]    	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
> [error]    	at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
> [error]    	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> [error]    	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
> [error]    	at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
> [error]    	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> [error]    	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
> [error]    	at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
> [error]    	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
> [error]    	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
> [error]    	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
> [error]    	at org.apache.spark.scheduler.Task.run(Task.scala:131)
> [error]    	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
> [error]    	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
> [error]    	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
> [error]    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> [error]    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> [error]    	at java.lang.Thread.run(Thread.java:748)
> [error]    Caused by: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 63, Column 8: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 63, Column 8: Private member cannot be accessed from type "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificSafeProjection".
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:1415)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1500)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1497)
> [error]    	at org.sparkproject.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
> [error]    	at org.sparkproject.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
> [error]    	at org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
> [error]    	... 32 more
> [error]
> [error]    Driver stacktrace:)])) is a Success but SparkFailures(NonEmpty[Unknown(org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 2.0 failed 1 times, most recent failure: Lost task 1.0 in stage 2.0 (TID 3) (192.168.0.80 executor driver): java.util.concurrent.ExecutionException: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 63, Column 8: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 63, Column 8: Private member cannot be accessed from type "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificSafeProjection".
> [error]    	at org.sparkproject.guava.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:306)
> [error]    	at org.sparkproject.guava.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:293)
> [error]    	at org.sparkproject.guava.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
> [error]    	at org.sparkproject.guava.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:135)
> [error]    	at org.sparkproject.guava.cache.LocalCache$LoadingValueReference.waitForValue(LocalCache.java:3620)
> [error]    	at org.sparkproject.guava.cache.LocalCache$Segment.waitForLoadingValue(LocalCache.java:2362)
> [error]    	at org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2349)
> [error]    	at org.sparkproject.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
> [error]    	at org.sparkproject.guava.cache.LocalCache.get(LocalCache.java:4000)
> [error]    	at org.sparkproject.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
> [error]    	at org.sparkproject.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1351)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:205)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:39)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1277)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1274)
> [error]    	at org.apache.spark.sql.execution.ObjectOperator$.deserializeRowToObject(objects.scala:147)
> [error]    	at org.apache.spark.sql.execution.AppendColumnsExec.$anonfun$doExecute$12(objects.scala:326)
> [error]    	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:898)
> [error]    	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:898)
> [error]    	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> [error]    	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
> [error]    	at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
> [error]    	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> [error]    	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
> [error]    	at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
> [error]    	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> [error]    	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
> [error]    	at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
> [error]    	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
> [error]    	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
> [error]    	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
> [error]    	at org.apache.spark.scheduler.Task.run(Task.scala:131)
> [error]    	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
> [error]    	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
> [error]    	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
> [error]    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> [error]    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> [error]    	at java.lang.Thread.run(Thread.java:748)
> [error]    Caused by: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 63, Column 8: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 63, Column 8: Private member cannot be accessed from type "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificSafeProjection".
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:1415)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1500)
> [error]    	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1497)
> [error]    	at org.sparkproject.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
> [error]    	at org.sparkproject.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
> [error]    	at org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
> [error]    	... 32 more
> [error]
> [error]    Driver stacktrace:)]) != Success (Expectations.scala:57)
> {code}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
