Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/09/19 18:59:21 UTC

[jira] [Commented] (SPARK-17596) Streaming job lacks Scala runtime methods

    [ https://issues.apache.org/jira/browse/SPARK-17596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15504337#comment-15504337 ] 

Sean Owen commented on SPARK-17596:
-----------------------------------

This sounds like a Scala version mismatch problem, or a packaging problem. Verify you have the same Scala version everywhere and aren't packaging Scala with your app.
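
As a concrete illustration of that advice, here is a minimal build.sbt sketch assuming sbt-assembly 0.14.x is in use (the assembly setting shown is based on that plugin's documented API and is the editor's illustration, not something taken from this issue):

    // build.sbt -- pin one Scala version and keep Spark "provided", so the
    // fat jar does not bundle a second, possibly mismatched, Scala runtime
    // next to the one the cluster's Spark distribution already ships.
    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"      % "2.0.0" % "provided",
      "org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided"
    )

    // Exclude scala-library from the assembly; Spark provides it at runtime.
    assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)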

> Streaming job lacks Scala runtime methods
> -----------------------------------------
>
>                 Key: SPARK-17596
>                 URL: https://issues.apache.org/jira/browse/SPARK-17596
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 2.0.0
>         Environment: Linux 4.4.20 x86_64 GNU/Linux
> openjdk version "1.8.0_102"
> Scala 2.11.8
>            Reporter: Evgeniy Tsvigun
>              Labels: kafka-0.8, streaming
>
> When using -> in Spark Streaming 2.0.0 jobs, or using spark-streaming-kafka-0-8_2.11 v2.0.0, and submitting the job with spark-submit, I get the following error:
>     Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 72.0 failed 1 times, most recent failure: Lost task 0.0 in stage 72.0 (TID 37, localhost): java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
> This only happens with spark-streaming; using ArrowAssoc in plain non-streaming Spark jobs works fine.
> I put a brief illustration of this phenomenon in a GitHub repo: https://github.com/utgarda/spark-2-streaming-nosuchmethod-arrowassoc
> With only "provided" Spark dependencies in build.sbt:
>     libraryDependencies ++= Seq(
>       "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
>       "org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided"
>     )
> using -> anywhere in the driver code, packaging it with sbt-assembly, and submitting the job results in an error. This isn't a big problem by itself, since ArrowAssoc can be avoided, but spark-streaming-kafka-0-8_2.11 v2.0.0 uses it somewhere internally and triggers the same error.
> When packaging with scala-library included, I can see the class in the jar afterwards, but it is still reported missing at runtime.
> The issue reported on StackOverflow: http://stackoverflow.com/questions/39395521/spark-2-0-0-streaming-job-packed-with-sbt-assembly-lacks-scala-runtime-methods
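
For anyone trying to reproduce the report above, a driver along these lines should exercise the failing call path. This is a hedged sketch by the editor; the object name, host, and port are made up, not taken from the linked repo:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Hypothetical reproduction sketch, not the reporter's actual code.
    object ArrowAssocRepro {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("arrow-assoc-repro")
        val ssc  = new StreamingContext(conf, Seconds(1))

        // `a -> b` compiles to a call to scala.Predef.ArrowAssoc; with a
        // mismatched scala-library in the assembly, this is where the
        // NoSuchMethodError surfaces at runtime.
        val pairs = ssc.socketTextStream("localhost", 9999)
          .map(line => line -> line.length)
        pairs.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }

After assembling, the jar contents can be inspected with something like "jar tf target/scala-2.11/myapp-assembly-0.1.jar | grep scala/Predef" (the jar path here is hypothetical). As the reporter notes, the class being present in the jar does not rule out a version mismatch with the Scala runtime that Spark actually loads.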



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org