Posted to issues@spark.apache.org by "buckhx (JIRA)" <ji...@apache.org> on 2015/10/19 19:31:05 UTC

[jira] [Commented] (SPARK-5929) Pyspark: Register a pip requirements file with spark_context

    [ https://issues.apache.org/jira/browse/SPARK-5929?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14963655#comment-14963655 ] 

buckhx commented on SPARK-5929:
-------------------------------

I also included an add-module method that bundles and ships a module the driver has already imported.
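
Roughly, the idea looks like the minimal sketch below. This is illustrative only, not the code from the fork: ship_imported_module() is a made-up name, and it assumes the module is a regular on-disk package (a single-file .py module would need separate handling):

    import os
    import shutil
    import tempfile

    def ship_imported_module(sc, module):
        """Bundle an already-imported package and register it with the context."""
        # Directory the package was loaded from, e.g. .../site-packages/foo
        pkg_dir = os.path.dirname(module.__file__)
        # Zip the package directory; SparkContext.addPyFile() accepts .zip archives.
        archive = shutil.make_archive(
            os.path.join(tempfile.mkdtemp(), module.__name__),
            "zip",
            root_dir=os.path.dirname(pkg_dir),
            base_dir=os.path.basename(pkg_dir),
        )
        # Ship the archive so the workers can import the module by name.
        sc.addPyFile(archive)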

> Pyspark: Register a pip requirements file with spark_context
> ------------------------------------------------------------
>
>                 Key: SPARK-5929
>                 URL: https://issues.apache.org/jira/browse/SPARK-5929
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: buckhx
>            Priority: Minor
>
> I've been doing a lot of work shipping dependencies to workers, since it is non-trivial for me to have my workers include the proper dependencies in their own environments.
> To get around this, I added an addRequirementsFile() method that takes a pip requirements file, downloads the packages, repackages them so they can be registered with addPyFile(), and ships them to the workers.
> Here is a comparison of what I've done on the Palantir fork:
> https://github.com/buckheroux/spark/compare/palantir:master...master
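
For anyone skimming the thread, the mechanics amount to something like the sketch below. This is illustrative only (the real code is in the fork linked above): add_requirements_file() is a made-up stand-in for the proposed addRequirementsFile(), and it uses "pip download", which older pip versions spelled "pip install --download":

    import glob
    import os
    import subprocess
    import tempfile

    def add_requirements_file(sc, requirements_path):
        """Fetch the pinned requirements and ship them to the workers."""
        target = tempfile.mkdtemp()
        # Download (without installing) every package in the requirements file.
        subprocess.check_call(
            ["pip", "download", "-r", requirements_path, "-d", target])
        # Register each downloaded distribution with the SparkContext. Workers
        # can import from .zip and .egg archives on their Python path; wheels
        # and sdists may need repackaging to .zip first, which is elided here.
        for dist in glob.glob(os.path.join(target, "*")):
            sc.addPyFile(dist)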



