Posted to dev@zeppelin.apache.org by "Ahyoung Ryu (JIRA)" <ji...@apache.org> on 2017/03/22 15:14:41 UTC

[jira] [Created] (ZEPPELIN-2298) Include Pyspark and SparkR in spark-dependencies by default

Ahyoung Ryu created ZEPPELIN-2298:
-------------------------------------

             Summary: Include Pyspark and SparkR in spark-dependencies by default
                 Key: ZEPPELIN-2298
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-2298
             Project: Zeppelin
          Issue Type: Wish
            Reporter: Ahyoung Ryu
             Fix For: 0.8.0


For now, if we want to use Pyspark or SparkR with the embedded Spark, we need to include {{-Ppyspark}}, {{-Psparkr}}, or {{-Pr}} when building Zeppelin from source (the binary package contains them by default). A build command with those profiles is sketched below.
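For reference, a source build with these profiles looks roughly like the following (a minimal sketch; the Spark version profile is illustrative, and the exact profile list depends on the Zeppelin version's build instructions):
{code}
# Build Zeppelin from source with Pyspark, SparkR, and R support enabled.
# -Pspark-2.1 is an example Spark version profile; adjust to your target version.
mvn clean package -DskipTests -Pspark-2.1 -Ppyspark -Psparkr -Pr
{code}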
Why don't we include them in {{spark-dependencies}} by default, so users don't need to care about these build profiles? If there is no special reason not to, I'd like to start working on it.
Please let me know if anyone has thoughts on this!



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)