Posted to user@spark.apache.org by raghavendran_c <ra...@hotmail.com> on 2017/05/12 17:00:46 UTC

Question on whether to use Java 8 or Scala for writing Spark applications

Hi,

Our organization is a Java shop. All of our developers are used to Java 7,
and are gearing up to migrate to Java 8. They have a fair knowledge of
Hadoop and Map-Reduce and are planning to learn Spark.

With Java 8 available (and the conciseness of its lambda expressions), is it
still beneficial to write Spark applications in Scala? Or would we be better
served by sticking with Spark + Java 8, rather than spending the effort to
learn Scala in addition to Spark?
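For context on what the lambda style buys us: this plain-JDK sketch (no Spark
dependency, class and method names are our own for illustration) counts words
with Java 8 streams. The flatMap -> groupingBy/counting pipeline here is
analogous in shape to the flatMap -> mapToPair -> reduceByKey chain one would
write against Spark's Java RDD API, so it gives a feel for how concise the
Java 8 version of Spark code can be.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class WordCount {

    // Count word occurrences across a list of lines using lambdas and the
    // Stream API: split each line into words, then group identical words
    // and count them.
    static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.split("\\s+")))
                .collect(Collectors.groupingBy(
                        Function.identity(), Collectors.counting()));
    }

    public static void main(String[] args) {
        // Prints a map of word -> occurrence count.
        System.out.println(count(Arrays.asList("to be or not to be", "be")));
    }
}
```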

One nagging worry about using Java 8 for Spark is that the Java API bindings
may lag behind when newer versions of Spark come out (and that those bindings
will be available sooner for Scala). Is this worry valid?

Also, regarding Scala, we are unsure about the availability of static code
analysis and unit test code coverage tools for it. For that reason, too, we
are leaning toward sticking with Java 8 for Spark.

Any advice on the above will be much appreciated.

thanks,
Raga



