Posted to user@spark.apache.org by em...@yeikel.com on 2019/06/12 04:25:01 UTC

What is the compatibility between releases?

Dear Community,

 

From what I understand, Spark uses a variation of Semantic Versioning [1],
but that information alone is not enough for me to tell whether code is
compatible across versions.

 

For example, if my cluster is running Spark 2.3.1, can I develop against
API additions introduced in Spark 2.4 (higher-order functions, to give an
example)? What about the other way around?
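
To make it concrete, this is roughly the kind of code I would like to
write (just a sketch; the dataset and column name are made up, and I am
assuming the SQL higher-order functions that were added in 2.4):

    // Sketch only: the data and column name are invented for illustration.
    // transform/aggregate are SQL higher-order functions added in Spark 2.4.
    import org.apache.spark.sql.SparkSession

    object HofExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hof-example")
          .master("local[*]") // local master only for trying it out
          .getOrCreate()
        import spark.implicits._

        val df = Seq(Seq(1, 2, 3), Seq(4, 5)).toDF("values")

        // These expressions are parsed by the SQL engine at runtime, so it
        // is the cluster's Spark version that decides whether they exist.
        df.selectExpr(
          "transform(values, x -> x * 2) AS doubled",
          "aggregate(values, 0, (acc, x) -> acc + x) AS total"
        ).show()

        spark.stop()
      }
    }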

 

Typically, I assume that a job written for Spark 1.x will fail on Spark
2.x, but that is also something I would like to confirm.
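
For instance, the 1.x-era jobs I have in mind look along these lines (a
spark-shell style sketch; the file path and table name are made up):

    // spark-shell style sketch; "sc" is the SparkContext the shell provides.
    // SQLContext and registerTempTable are the 1.x-era pieces that were
    // superseded by SparkSession / createOrReplaceTempView in 2.x.
    import org.apache.spark.sql.SQLContext

    val sqlContext = new SQLContext(sc)
    val df = sqlContext.read.json("events.json")
    df.registerTempTable("events")
    sqlContext.sql("SELECT count(*) FROM events").show()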

 

Thank you for your help!

 

[1] https://spark.apache.org/versioning-policy.html 


Re: What is the compatibility between releases?

Posted by Yeikel <em...@yeikel.com>.
Hi Community,

I am still looking for an answer to this question.

I am running a cluster on Spark 2.3.1, but I am wondering if it is safe to
include the Spark 2.4.1 dependency and use new features such as
higher-order functions.
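
Concretely, the setup I have in mind is something like this (a rough
build.sbt sketch; the Scala version and the "provided" scope are my
assumptions about a typical setup):

    // Rough build.sbt sketch; Scala version and "provided" scope are assumptions.
    scalaVersion := "2.11.12"

    // Compile against Spark 2.4.1 to get the new APIs ...
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.1" % "provided"

    // ... while the assembled job is submitted to a cluster running Spark 2.3.1.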

Thank you.



