Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/05/15 14:55:59 UTC

[jira] [Resolved] (SPARK-5241) spark-ec2 spark init scripts do not handle all hadoop (or tachyon?) dependencies correctly

     [ https://issues.apache.org/jira/browse/SPARK-5241?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-5241.
------------------------------
    Resolution: Invalid

I don't understand the problem being reported here. Reopen if you can suggest a particular change. Maybe start by asking on user@?

> spark-ec2 spark init scripts do not handle all hadoop (or tachyon?) dependencies correctly
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-5241
>                 URL: https://issues.apache.org/jira/browse/SPARK-5241
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, EC2
>            Reporter: Florian Verhein
>
> spark-ec2/spark/init.sh doesn't completely respect hadoop version dependencies. This may also be an issue for tachyon dependencies. Related: tachyon also appears to require builds against the right version of hadoop (which probably causes SPARK-3185).
> This applies to the Spark build from a git checkout in spark/init.sh (I suspect this should also be changed to use mvn, since that is the reference build according to the docs?).
> It may apply to the pre-built Spark in spark/init.sh as well, but I'm not sure about this; e.g. I thought the hadoop2.4 and cdh4.2 builds of Spark are different.
> Also note that hadoop native is built from hadoop 2.4.1 on the AMI, and this is used regardless of HADOOP_MAJOR_VERSION in the *-hdfs modules.
> Tachyon is hard-coded to 0.4.1 (which is probably built against hadoop 1.x?).
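
For context, the mvn reference build the reporter alludes to would look roughly like the following. This is a sketch only; the exact profile name and Hadoop version flag are assumptions based on the Spark 1.x "Building Spark" docs, and would need to match the cluster's actual Hadoop version:

```shell
# Sketch: build Spark from a git checkout against a specific Hadoop
# version using the reference mvn build (profile/version values assumed;
# adjust -Phadoop-* and -Dhadoop.version to the target cluster).
mvn -Phadoop-2.4 -Dhadoop.version=2.4.1 -DskipTests clean package
```

The point of the report is that spark/init.sh does not consistently thread such a version choice through to the Spark (and possibly Tachyon) builds.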



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org