Posted to issues@spark.apache.org by "Nicholas Chammas (JIRA)" <ji...@apache.org> on 2014/10/13 16:19:34 UTC

[jira] [Comment Edited] (SPARK-922) Update Spark AMI to Python 2.7

    [ https://issues.apache.org/jira/browse/SPARK-922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14169331#comment-14169331 ] 

Nicholas Chammas edited comment on SPARK-922 at 10/13/14 2:19 PM:
------------------------------------------------------------------

[~joshrosen] - Do you mean [this script|https://github.com/mesos/spark-ec2/blob/v4/create_image.sh]? It doesn't seem to have anything related to Python 2.7.

Anyway, what I meant was to ask whether you'd be open to holding off on updating the Spark AMIs until we've also figured out how to automate that process per [SPARK-3821]. I should have something for that as soon as this week or next.
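
For context, here is a rough, hypothetical sketch of what Python 2.7 support in an image-creation script like that one could look like on Amazon Linux. The package names and the profile.d approach are assumptions on my part, not something taken from the actual create_image.sh:

    # Hypothetical sketch only -- not taken from the linked create_image.sh.
    # Install Python 2.7 alongside the stock Python 2.6 on Amazon Linux,
    # leaving the system "python" (which yum depends on) untouched.
    sudo yum install -y python27 python27-devel

    # Point PySpark at the new interpreter via the standard PYSPARK_PYTHON
    # variable; the profile.d location is an assumption about how the
    # cluster nodes would pick it up at login.
    echo 'export PYSPARK_PYTHON=python2.7' | sudo tee /etc/profile.d/pyspark-python.sh

On a cluster, PYSPARK_PYTHON would need to be set the same way on every node so the workers use the same interpreter as the driver.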


> Update Spark AMI to Python 2.7
> ------------------------------
>
>                 Key: SPARK-922
>                 URL: https://issues.apache.org/jira/browse/SPARK-922
>             Project: Spark
>          Issue Type: Task
>          Components: EC2, PySpark
>    Affects Versions: 0.9.0, 0.9.1, 1.0.0, 1.1.0
>            Reporter: Josh Rosen
>
> Many Python libraries only support Python 2.7+, so we should make Python 2.7 the default Python on the Spark AMIs.



