Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2015/04/14 22:41:59 UTC

[jira] [Closed] (SPARK-5808) Assembly generated by sbt does not contain pyspark

     [ https://issues.apache.org/jira/browse/SPARK-5808?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or closed SPARK-5808.
----------------------------
          Resolution: Fixed
       Fix Version/s: 1.4.0
            Assignee: Marcelo Vanzin
    Target Version/s: 1.4.0

> Assembly generated by sbt does not contain pyspark
> --------------------------------------------------
>
>                 Key: SPARK-5808
>                 URL: https://issues.apache.org/jira/browse/SPARK-5808
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.0.0
>            Reporter: Marcelo Vanzin
>            Assignee: Marcelo Vanzin
>            Priority: Minor
>             Fix For: 1.4.0
>
>
> When you generate the assembly with sbt, the py4j and pyspark files are not added to it. This breaks pyspark when you run that assembly on YARN: SPARK_HOME is not propagated to YARN containers, so PythonUtils.scala cannot add the needed pyspark paths to PYTHONPATH (a sketch of that path logic follows below).
> This is minor, since all released bits are built with Maven; it should only affect developers who build with sbt and try pyspark on YARN.
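For illustration, here is a minimal Scala sketch of the SPARK_HOME-based PYTHONPATH construction described above. It is an assumption-laden stand-in, not the actual PythonUtils.scala code: the object name PythonPathSketch and the py4j-src.zip file name are made up, but the shape of the logic (derive paths from SPARK_HOME, or add nothing) matches the behavior the report describes.

    import java.io.File
    import scala.collection.mutable.ArrayBuffer

    object PythonPathSketch {
      // Build the PYTHONPATH entries pyspark needs at runtime.
      def sparkPythonPath: String = {
        val paths = ArrayBuffer.empty[String]
        // SPARK_HOME is the only source of these paths. YARN does not
        // propagate it to containers, so the loop body never runs there
        // and the result is an empty string.
        for (sparkHome <- sys.env.get("SPARK_HOME")) {
          paths += Seq(sparkHome, "python").mkString(File.separator)
          paths += Seq(sparkHome, "python", "lib", "py4j-src.zip").mkString(File.separator)
        }
        paths.mkString(File.pathSeparator)
      }

      def main(args: Array[String]): Unit = {
        // Empty output here is the failure mode: with no SPARK_HOME and no
        // pyspark files in the assembly jar, Python workers cannot start.
        println(sparkPythonPath)
      }
    }

A quick way to confirm the underlying problem on an affected build is to list the sbt-built assembly jar and check for pyspark files, e.g. "jar tf assembly/target/scala-2.10/spark-assembly-*.jar | grep pyspark" (the jar path is illustrative); before the fix this returns nothing.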



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org