Posted to user@spark.apache.org by Ramesh Mathikumar <me...@googlemail.com.INVALID> on 2020/08/16 20:55:19 UTC

Spark - Scala-Java interoperability

Hi Team,

A quick question from my side.

Can I use spark-submit with an application that contains both Java and
Scala in a single workflow? By a single workflow I mean the main
program is in Java (wrapped in Spark), and it calls a module written
in Scala (also wrapped in Spark) to calculate something on the payload.

Are there any compatibility or interoperability issues to be aware of?

Regards,
Ramster

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: Spark - Scala-Java interoperability

Posted by Sean Owen <sr...@gmail.com>.
That should be fine. The JVM doesn't care how the bytecode it
executes was produced. As long as you can compile the Java and Scala
sources together - which sometimes means using a plugin like
scala-maven-plugin for mixed compilation - the result should be fine.
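
For anyone finding this later, here is a minimal sketch of that setup.
All names below are made up for illustration, and it assumes a single
Maven module with src/main/java and src/main/scala compiled together
via scala-maven-plugin. The calculation lives in a Scala object, and
the Java main class calls it: scalac compiles a top-level object to an
ordinary JVM class and emits static forwarders for its methods, so
from Java the call looks like a plain static method call.

    // Scala side, e.g. src/main/scala/example/PayloadMetrics.scala
    // (hypothetical module doing "something on the payload"):
    //
    //   package example
    //   import org.apache.spark.sql.Dataset
    //
    //   object PayloadMetrics {
    //     // Count lines at least minLen characters long.
    //     def countLongLines(lines: Dataset[String], minLen: Int): Long =
    //       lines.filter(_.length >= minLen).count()
    //   }

    // Java side, src/main/java/example/JavaDriver.java,
    // the spark-submit entry point:
    package example;

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.SparkSession;

    public final class JavaDriver {
      public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .appName("java-scala-interop-demo")
            .getOrCreate();

        // Dataset<String> here is the same class as Dataset[String]
        // on the Scala side; there is only one type at the JVM level.
        Dataset<String> lines = spark.read().textFile(args[0]);

        // scalac emits a static forwarder for the object's method, so
        // this is an ordinary static call into the Scala-compiled module.
        long n = PayloadMetrics.countLongLines(lines, 80);
        System.out.println("lines with >= 80 chars: " + n);

        spark.stop();
      }
    }

Build both source trees into one assembly jar and submit it as usual,
e.g. spark-submit --class example.JavaDriver app-assembly.jar input.txt
(jar and file names hypothetical). With scala-maven-plugin, the usual
mixed-compilation setup runs scalac over both source trees first, so
references can resolve in either direction, and javac then compiles
the Java sources against the result.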


---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org