Posted to user@beam.apache.org by Zheng Ni <ni...@gmail.com> on 2022/01/05 06:50:49 UTC

Compatibility of Spark portable runner

Hi Beam Community,

Greetings.

I am interested in submitting a Spark job through the portable runner. I have a question about the compatibility between the spark_job_server and the Spark cluster.

Let's say I am going to use beam_spark_job_server version 2.35.0. How can I tell which Spark cluster version is compatible with it? Or would it work with any version of the Spark cluster?

Regards,

Zheng

Re: Compatibility of Spark portable runner

Posted by Kyle Weaver <kc...@google.com>.
Specifically, we publish different builds of the job server for Spark
versions 2 and 3. As Alexey said, minor versions are usually compatible as
long as you match major versions.

If you're looking for Docker images:
Spark 2: https://hub.docker.com/r/apache/beam_spark_job_server
Spark 3: https://hub.docker.com/r/apache/beam_spark3_job_server
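
For reference, a rough (untested) sketch of a submission through one of those
job servers, assuming a Spark 3 cluster at spark://localhost:7077 and the
default job endpoint on port 8099 (the host and ports here are placeholders):

    # Start a job server matching the cluster's major Spark version, e.g.:
    #   docker run --net=host apache/beam_spark3_job_server:2.35.0 \
    #       --spark-master-url=spark://localhost:7077
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=PortableRunner",
        "--job_endpoint=localhost:8099",  # where the job server listens
        "--environment_type=LOOPBACK",    # simplest SDK harness for testing
    ])
    with beam.Pipeline(options=options) as pipeline:
        (pipeline
         | beam.Create(["hello", "beam"])
         | beam.Map(print))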

On Wed, Jan 5, 2022 at 10:29 AM Alexey Romanenko <ar...@gmail.com> wrote:

> Generally speaking, to avoid potential issues, the versions used at compile
> time and at runtime should be the same (most importantly the Scala version),
> but, due to Spark's backward compatibility, minor versions can differ.

Re: Compatibility of Spark portable runner

Posted by Alexey Romanenko <ar...@gmail.com>.
Generally speaking, to avoid potential issues, the versions used at compile time and at runtime should be the same (most importantly the Scala version), but, due to Spark's backward compatibility, minor versions can differ.
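
If you need to check what a running cluster was built with, one (untested)
way from PySpark is sketched below; note that the Scala lookup goes through
the internal _jvm gateway, so treat it as a best-effort check, and the master
URL as a placeholder:

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .master("spark://localhost:7077")  # placeholder master URL
             .appName("version-check")
             .getOrCreate())
    # The major version here should match the job server build (2.x vs 3.x).
    print("Spark:", spark.version)
    # Scala version via the JVM gateway (internal API, may change).
    print(spark.sparkContext._jvm.scala.util.Properties.versionString())
    spark.stop()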
