Posted to issues@spark.apache.org by "Jeff Zhang (JIRA)" <ji...@apache.org> on 2016/03/01 05:58:18 UTC

[jira] [Created] (SPARK-13587) Support virtualenv in PySpark

Jeff Zhang created SPARK-13587:
----------------------------------

             Summary: Support virtualenv in PySpark
                 Key: SPARK-13587
                 URL: https://issues.apache.org/jira/browse/SPARK-13587
             Project: Spark
          Issue Type: Improvement
          Components: PySpark
            Reporter: Jeff Zhang


Currently, it's not easy for users to add third-party Python packages in PySpark.
* One way is to use --py-files (suitable for simple dependencies, but not for complicated ones, especially those with transitive dependencies); a sketch follows this list.
* Another way is to install the packages manually on each node (time-consuming, and not easy to switch between different environments).
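For reference, here is a minimal sketch of the first approach using SparkContext.addPyFile, the programmatic equivalent of --py-files. The archive name deps.zip is illustrative and assumed to contain pure-Python modules:

{code:python}
# Ship a zip/egg of pure-Python code to every executor. This works for
# simple packages, but it does not resolve transitive dependencies and
# cannot build packages with native extensions on the workers.
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("py-deps-demo")
sc = SparkContext(conf=conf)

# deps.zip is assumed to contain the third-party modules; after this
# call they are importable inside functions shipped to executors.
sc.addPyFile("deps.zip")

print(sc.parallelize([1, 2, 3]).map(lambda x: x * 2).collect())
sc.stop()
{code}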

Python now has two different virtualenv implementations: one is the native virtualenv tool, the other is conda. This JIRA is trying to bring these two tools to the distributed environment; a sketch of the per-node mechanics follows below.
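As a rough illustration of those mechanics (not the actual design of this JIRA; the helper name and its parameters are hypothetical), each node would need to create an isolated environment and install the application's requirements into it before launching its Python workers:

{code:python}
# Hypothetical sketch: create an isolated environment on a node and
# install dependencies into it. The function name and parameters are
# illustrative assumptions, not part of any Spark API.
import os
import subprocess
import sys


def setup_python_env(env_dir, requirements_file, use_conda=False):
    """Create a virtualenv or conda env and return its python binary."""
    if use_conda:
        # conda creates the environment together with its own interpreter.
        subprocess.check_call([
            "conda", "create", "-y", "--prefix", env_dir,
            "python=%d.%d" % sys.version_info[:2],
        ])
    else:
        # Native virtualenv: create the environment under env_dir.
        subprocess.check_call(["virtualenv", env_dir])
    pip = os.path.join(env_dir, "bin", "pip")
    # Install the application's dependencies, including transitive ones.
    subprocess.check_call([pip, "install", "-r", requirements_file])
    return os.path.join(env_dir, "bin", "python")
{code}

A node could then point PYSPARK_PYTHON at the returned interpreter so that Python workers start inside the isolated environment.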



