Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2014/12/20 02:04:07 UTC

[jira] [Resolved] (SPARK-4890) Upgrade Boto to 2.34.0; automatically download Boto from PyPI instead of packaging it

     [ https://issues.apache.org/jira/browse/SPARK-4890?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen resolved SPARK-4890.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 1.3.0

Issue resolved by pull request 3737
[https://github.com/apache/spark/pull/3737]

> Upgrade Boto to 2.34.0; automatically download Boto from PyPI instead of packaging it
> -------------------------------------------------------------------------------------
>
>                 Key: SPARK-4890
>                 URL: https://issues.apache.org/jira/browse/SPARK-4890
>             Project: Spark
>          Issue Type: Improvement
>          Components: EC2
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>             Fix For: 1.3.0
>
>
> We should upgrade to a newer version of Boto (2.34.0), since the old version is blocking several features.  It looks like newer versions of Boto don't work properly when they're loaded from a zipfile, since they try to read a JSON file from a path relative to the Boto library sources.
> Therefore, I think we should change {{spark-ec2}} to automatically download Boto from PyPI if it's not present in {{SPARK_EC2_DIR/lib}}, similar to what we do in the {{sbt/sbt}} scripts.  This shouldn't be an issue for users, since they already need an internet connection to launch an EC2 cluster.  By performing the download in {{spark_ec2.py}} instead of the Bash script, this should also work for Windows users.
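
For illustration, here is a minimal sketch of the download-on-demand approach
described in the issue. It is not the actual change from pull request 3737;
the helper name, directory layout, and PyPI URL are assumptions, and it is
written against Python 2 to match spark_ec2.py of that era.

    # Hypothetical helper sketching the idea: fetch Boto 2.34.0 from PyPI
    # into SPARK_EC2_DIR/lib if it is not already there, then put the
    # unpacked sources on sys.path before "import boto".
    import os
    import sys
    import tarfile
    import urllib2  # Python 2; urllib.request is the Python 3 equivalent

    BOTO_VERSION = "2.34.0"
    SPARK_EC2_DIR = os.path.dirname(os.path.realpath(__file__))
    LIB_DIR = os.path.join(SPARK_EC2_DIR, "lib")
    BOTO_DIR = os.path.join(LIB_DIR, "boto-" + BOTO_VERSION)
    # Assumed PyPI source-distribution URL layout (as served circa 2014):
    BOTO_URL = ("https://pypi.python.org/packages/source/b/boto/"
                "boto-%s.tar.gz" % BOTO_VERSION)

    def ensure_boto():
        """Download and unpack Boto from PyPI unless it is already present."""
        if not os.path.isdir(BOTO_DIR):
            if not os.path.isdir(LIB_DIR):
                os.makedirs(LIB_DIR)
            tgz_path = os.path.join(LIB_DIR, "boto-%s.tar.gz" % BOTO_VERSION)
            tgz = urllib2.urlopen(BOTO_URL)
            with open(tgz_path, "wb") as f:
                f.write(tgz.read())
            # Extract to a real directory (not a zipfile), so that Boto can
            # read its JSON data files relative to its own sources.
            tar = tarfile.open(tgz_path)
            tar.extractall(path=LIB_DIR)
            tar.close()
            os.remove(tgz_path)
        sys.path.insert(0, BOTO_DIR)

    ensure_boto()
    import boto  # now resolves to the downloaded copy

Because the tarball is unpacked into an ordinary directory instead of being
loaded from a zipfile, Boto can resolve those relative JSON paths, which is
exactly the failure mode the upgrade ran into.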



