Posted to dev@spark.apache.org by t4 <re...@hotmail.com> on 2018/10/16 21:06:16 UTC

Re: Hadoop 3 support

Has anyone got Spark jars working with Hadoop 3.1 that they can share? I am
looking to be able to use the latest hadoop-aws fixes from v3.1.



--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Hadoop 3 support

Posted by Hyukjin Kwon <gu...@gmail.com>.
See the discussion at https://github.com/apache/spark/pull/21588

On Wed, Oct 17, 2018 at 5:06 AM, t4 <re...@hotmail.com> wrote:

> Has anyone got Spark jars working with Hadoop 3.1 that they can share? I am
> looking to be able to use the latest hadoop-aws fixes from v3.1.

Re: Hadoop 3 support

Posted by Steve Loughran <st...@hortonworks.com>.

> On 16 Oct 2018, at 22:06, t4 <re...@hotmail.com> wrote:
> 
> Has anyone got Spark jars working with Hadoop 3.1 that they can share? I am
> looking to be able to use the latest hadoop-aws fixes from v3.1.

We do, but we do it with:

* a patched Hive JAR
* building Spark with the -Phive,yarn,hadoop-3.1,hadoop-cloud,kinesis profiles to pull in the object store stuff *while leaving out the things which cause conflicts*
* some extra stuff to wire up the zero-rename committer (see the sketch after this list)
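
As a rough sketch of that last point (not lifted from this thread; it assumes Spark was built with the hadoop-cloud profile so the org.apache.spark.internal.io.cloud classes are on the classpath, and the app name and committer choice are only illustrative), the Scala side of the wiring can look like:

  import org.apache.spark.sql.SparkSession

  // Route Spark's output commit protocol through Hadoop's PathOutputCommitter API
  // and pick one of the S3A committers ("directory", "partitioned" or "magic").
  val spark = SparkSession.builder()
    .appName("s3a-committer-sketch")  // illustrative name
    .config("spark.sql.sources.commitProtocolClass",
      "org.apache.spark.internal.io.cloud.PathOutputCommitProtocol")
    .config("spark.sql.parquet.output.committer.class",
      "org.apache.spark.internal.io.cloud.BindingParquetOutputCommitter")
    .config("spark.hadoop.fs.s3a.committer.name", "directory")
    .getOrCreate()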

W.r.t. hadoop-aws, the Hadoop 2.9 artifacts have the shaded AWS SDK JAR (50 MB of .class files, there to avoid Jackson dependency pain) and an early version of S3Guard. For the new commit stuff you will need to go to Hadoop 3.1.
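
For reference, a minimal sketch of using the s3a:// connector from a Spark job once hadoop-aws and the matching AWS SDK are on the classpath; the bucket paths and the idea of passing credentials via environment variables are assumptions for illustration, since S3A will normally pick credentials up from its provider chain (env vars, instance roles, etc.):

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder().appName("s3a-read-sketch").getOrCreate()

  // Setting the keys on the Hadoop configuration explicitly is just one option;
  // the default S3A credential chain usually makes this unnecessary.
  val hadoopConf = spark.sparkContext.hadoopConfiguration
  hadoopConf.set("fs.s3a.access.key", sys.env("AWS_ACCESS_KEY_ID"))
  hadoopConf.set("fs.s3a.secret.key", sys.env("AWS_SECRET_ACCESS_KEY"))

  // Read and write through the s3a:// connector provided by hadoop-aws.
  val df = spark.read.parquet("s3a://some-example-bucket/input/table")
  df.write.parquet("s3a://some-example-bucket/output/table")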

-steve




