Posted to dev@spark.apache.org by "张志强(旺轩)" <zz...@alibaba-inc.com> on 2015/11/20 02:16:36 UTC

Re: Dropping support for earlier Hadoop versions in Spark 2.0?

I agreed
+1
------------------------------------------------------------------
From: Reynold Xin<rx...@databricks.com>
Date: 2015/11/20 06:14:44
To: dev@spark.apache.org<de...@spark.apache.org>; Sean Owen<sr...@gmail.com>; Thomas Graves<tg...@apache.org>
Subject: Dropping support for earlier Hadoop versions in Spark 2.0?

I proposed dropping support for Hadoop 1.x in the Spark 2.0 email, and I think everybody is for that.
https://issues.apache.org/jira/browse/SPARK-11807

Sean suggested also dropping support for Hadoop 2.2, 2.3, and 2.4. That is to say, keep only Hadoop 2.6 and greater.
What are the community's thoughts on that?



Re: Dropping support for earlier Hadoop versions in Spark 2.0?

Posted by Ted Yu <yu...@gmail.com>.
Should a new job be set up under Spark-Master-Maven-with-YARN for hadoop
2.6.x ?

Cheers
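
A Jenkins job like the one Ted asks about would build Spark against Hadoop 2.6 via Maven profiles. A sketch of such an invocation, using the `yarn` and `hadoop-2.6` profile names from the Spark 1.x build documentation (the exact Jenkins configuration is an assumption here):

```shell
# Build Spark with YARN support against Hadoop 2.6 (profile names as
# documented for the Spark 1.x Maven build; a sketch, not the actual
# Jenkins job definition).
./build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package
```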

On Thu, Nov 19, 2015 at 5:16 PM, 张志强(旺轩) <zz...@alibaba-inc.com> wrote:

> I agreed
> +1
>
> ------------------------------------------------------------------
> From: Reynold Xin<rx...@databricks.com>
> Date: 2015/11/20 06:14:44
> To: dev@spark.apache.org<de...@spark.apache.org>; Sean Owen<sr...@gmail.com>;
> Thomas Graves<tg...@apache.org>
> Subject: Dropping support for earlier Hadoop versions in Spark 2.0?
>
>
> I proposed dropping support for Hadoop 1.x in the Spark 2.0 email, and I
> think everybody is for that.
>
> https://issues.apache.org/jira/browse/SPARK-11807
>
> Sean suggested also dropping support for Hadoop 2.2, 2.3, and 2.4. That is
> to say, keep only Hadoop 2.6 and greater.
>
> What are the community's thoughts on that?
>
>
>