Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/04/11 01:27:13 UTC

[jira] [Assigned] (SPARK-5808) Assembly generated by sbt does not contain pyspark

     [ https://issues.apache.org/jira/browse/SPARK-5808?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-5808:
-----------------------------------

    Assignee: Apache Spark

> Assembly generated by sbt does not contain pyspark
> --------------------------------------------------
>
>                 Key: SPARK-5808
>                 URL: https://issues.apache.org/jira/browse/SPARK-5808
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>            Reporter: Marcelo Vanzin
>            Assignee: Apache Spark
>            Priority: Minor
>
> When the assembly is generated with sbt, the py4j and pyspark files are not added to it. As a result, pyspark does not work when that assembly is run on YARN: SPARK_HOME is not propagated to YARN containers, so PythonUtils.scala cannot add the needed pyspark paths to PYTHONPATH, and the files must instead be found inside the assembly itself.
> This is minor, since all released bits are built with Maven; it should only affect developers who build with sbt and then try pyspark on YARN.
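>
> A minimal Scala sketch of the mechanism described above (not the actual PythonUtils.scala source; the object name and the py4j zip file name are assumptions for illustration):
>
>     import java.io.File
>
>     object PythonPathSketch {
>       // Mirrors the idea behind PythonUtils.scala: derive PYTHONPATH
>       // entries from SPARK_HOME when that variable is available.
>       def sparkPythonPath: String = {
>         val entries = sys.env.get("SPARK_HOME").toSeq.flatMap { home =>
>           Seq(
>             Seq(home, "python").mkString(File.separator),
>             // The py4j zip name is assumed here; it varies by release.
>             Seq(home, "python", "lib", "py4j-0.8.2.1-src.zip")
>               .mkString(File.separator))
>         }
>         entries.mkString(File.pathSeparator)
>       }
>
>       def main(args: Array[String]): Unit = {
>         // On YARN, SPARK_HOME is not set in the containers, so this prints
>         // an empty path; pyspark/py4j must then come from the assembly jar.
>         println(sparkPythonPath)
>       }
>     }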



