Posted to users@zeppelin.apache.org by Fabrizio Fab <fa...@tiscali.it> on 2021/10/28 11:43:33 UTC

Scala 2.12 version mismatch for Spark Interpreter

I am aware that Spark 3.2.0 is not officially released yet, but I am trying to put it to work.

The first thing I noticed is the following:

the SparkInterpreter is compiled against Scala 2.12.7,

while Spark 3.2 is compiled against Scala 2.12.15.

Unfortunately there are some breaking changes between the two versions (even though only the minor version changed!) that require recompiling (and, I hope, no code changes).
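
(As an aside: a quick way to check which Scala patch release a given Spark build ships with is to ask the REPL itself. This is just a sanity check I would run, and the output below is what I would expect from a Spark 3.2 distribution:)

    // In spark-shell: print the Scala version the distribution was built with
    scala> scala.util.Properties.versionString
    res0: String = version 2.12.15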

The first incompatibility I ran into is at line 66 of SparkScala212Interpreter.scala:
    val settings = new Settings()
    settings.processArguments(List("-Yrepl-class-based",
      "-Yrepl-outdir", s"${outputDir.getAbsolutePath}"), true)
    settings.embeddedDefaults(sparkInterpreterClassLoader)

    settings.usejavacp.value = true   // <-- this is the line that breaks

scala.tools.nsc.Settings.usejavacp was moved in Scala 2.12.13 from AbsSettings to MutableSettings, so an interpreter compiled against 2.12.7 fails with a runtime error when run against the 2.12.15 library.
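
(Until the interpreter is rebuilt against a newer Scala, one possible workaround, just a sketch on my part, is to avoid referencing the relocated field at all and pass the equivalent compiler flag as a plain string through processArguments, which has the same shape in both versions. Same vals as in the snippet above:)

    // Workaround sketch: enable -usejavacp via the string-based option API
    // instead of assigning settings.usejavacp.value, so no direct reference
    // to the relocated setting is compiled into the interpreter.
    val settings = new Settings()
    settings.processArguments(List(
      "-Yrepl-class-based",
      "-Yrepl-outdir", s"${outputDir.getAbsolutePath}",
      "-usejavacp"), processAll = true)
    settings.embeddedDefaults(sparkInterpreterClassLoader)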


I'll let you know if I manage to resolve all the problems.



Re: Scala 2.12 version mismatch for Spark Interpreter

Posted by Mich Talebzadeh <mi...@gmail.com>.
Apologies, that should say the docker image should be on 3.1.1.







On Thu, 28 Oct 2021 at 14:34, Mich Talebzadeh <mi...@gmail.com>
wrote:

> You should go for Spark 3.1.1 for k8s. That is the tried and tested one
> for Kubernetes in the Spark 3 series, meaning the docker image should be
> on .1.1 and your client, which I think is used to run spark-submit on
> k8s, should also be on 3.1.1.

Re: Scala 2.12 version mismatch for Spark Interpreter

Posted by Mich Talebzadeh <mi...@gmail.com>.
You should go for Spark 3.1.1 for k8s. That is the tried and tested one
for Kubernetes in the Spark 3 series, meaning the docker image should be
on .1.1 and your client, which I think is used to run spark-submit on
k8s, should also be on 3.1.1.
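
As an illustration only (registry, image name and API server address below are placeholders, not tested values), the point is that the client running spark-submit and the container image both carry 3.1.1:

   spark-submit \
     --master k8s://https://<k8s-api-server>:6443 \
     --deploy-mode cluster \
     --name example-app \
     --conf spark.kubernetes.container.image=<registry>/spark:3.1.1 \
     local:///opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar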

HTH







Re: Scala 2.12 version mismatch for Spark Interpreter

Posted by Jeff Zhang <zj...@gmail.com>.
Hi Fabrizio,

Support for Spark 3.2.0 was added recently in this PR:
https://github.com/apache/zeppelin/pull/4257
The problem you mentioned is solved there.


-- 
Best Regards

Jeff Zhang