Posted to issues@spark.apache.org by "M. Le Bihan (JIRA)" <ji...@apache.org> on 2018/12/06 13:34:00 UTC

[jira] [Created] (SPARK-26296) Spark build over Java and not over Scala, offering Scala as an option over Spark

M. Le Bihan created SPARK-26296:
-----------------------------------

             Summary: Spark build over Java and not over Scala, offering Scala as an option over Spark
                 Key: SPARK-26296
                 URL: https://issues.apache.org/jira/browse/SPARK-26296
             Project: Spark
          Issue Type: Wish
          Components: Spark Core
    Affects Versions: 2.4.0
            Reporter: M. Le Bihan


I do not use _Scala_ when programming _Spark_, but plain _Java_. I believe those using _PySpark_ do not use it either.

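For illustration, a job of this kind can be written entirely in plain _Java_ against the Java API of _Spark_. A minimal sketch (the class name is hypothetical, and the Scala runtime still arrives on the classpath as a transitive dependency):

{code:java}
import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

// Hypothetical example class: a small word count with no Scala source at all.
public class PlainJavaSparkJob {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("PlainJavaSparkJob").setMaster("local[*]");

        // JavaSparkContext implements Closeable, so try-with-resources applies.
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            List<String> lines = Arrays.asList("spark over java", "java not scala");

            // Plain Java lambdas, no Scala closures in the source code.
            JavaRDD<String> words = sc.parallelize(lines)
                    .flatMap(line -> Arrays.asList(line.split(" ")).iterator());

            System.out.println("Words: " + words.filter(w -> !w.isEmpty()).count());
        }
    }
}
{code}
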
But _Spark_ has been built over _Scala_ instead of plain _Java_, and this causes trouble, especially when it comes time to upgrade the JDK. We are waiting to move to JDK 11, and _Scala_ is still holding _Spark_ back on the previous version of the JDK.

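Even in such pure-Java sources, _Scala_ leaks through the API: the Java pair-RDD operations expose _scala.Tuple2_ directly, so every Java project is tied to a specific Scala release, and therefore to the JDK versions that release supports. A minimal sketch (class and method names are hypothetical):

{code:java}
import scala.Tuple2; // a Scala class that Java callers must import for pair RDDs

import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;

// Hypothetical helper showing scala.Tuple2 appearing in plain Java code.
public class ScalaLeaksIntoJava {
    static JavaPairRDD<String, Integer> wordCounts(JavaRDD<String> words) {
        // mapToPair forces the Java caller to construct scala.Tuple2 instances.
        return words.mapToPair(w -> new Tuple2<>(w, 1))
                    .reduceByKey(Integer::sum);
    }
}
{code}
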
Having a _Spark_ without _Scala_, just as it is possible to have a _Spark_ without _Hadoop_, would reassure me: one source of issues would disappear.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org