Posted to user@spark.apache.org by "Zhang, Liyun" <li...@intel.com> on 2017/10/27 02:21:39 UTC

Anyone knows how to build and spark on jdk9?

Hi all:
1.       I want to build Spark on JDK 9 and test it with Hadoop in a JDK 9 environment. I searched for JIRAs related to JDK 9 and only found SPARK-13278<https://issues.apache.org/jira/browse/SPARK-13278>. Does this mean Spark can now build and run successfully on JDK 9?


Best Regards
Kelly Zhang/Zhang,Liyun


Re: Anyone knows how to build and spark on jdk9?

Posted by Steve Loughran <st...@hortonworks.com>.
On 27 Oct 2017, at 19:24, Sean Owen <so...@cloudera.com> wrote:

Certainly, Scala 2.12 support precedes Java 9 support. A lot of the work is in place already, and the last issue is dealing with how Scala closures are now implemented quite differently with lambdas / invokedynamic. This affects the ClosureCleaner. For the interested, this is, as far as I know, the main remaining issue.

Despite the odd naming, all of these versions of Java are successors to Java 9. Supporting any of them is probably the same thing, so the work for now is still getting it working on Java 9.

Whereas Java has been very backwards-compatible in the past, the new module structure is almost certain to break something in Spark or its dependencies. Removing JAXB from the JDK alone causes issues. Getting it to run at all on Java 9 may require changes, whereas compatibility with new Java major releases in the past generally came for free. It'll be worth trying to make that happen soonish. I'm guessing for Spark 3.x in the first half of next year?


It is going to be traumatic across the stack, but it's probably best to start it as a background activity, just to be aware of what's going to work and where the trouble is (*).

But, first things first. Scala 2.12 support.

On Fri, Oct 27, 2017 at 6:02 PM Jörn Franke <jo...@gmail.com> wrote:
Scala 2.12 is not yet supported on Spark - this also means no JDK 9:
https://issues.apache.org/jira/plugins/servlet/mobile#issue/SPARK-14220

If you look at Oracle's support model, JDK 9 is anyway only supported for 6 months. JDK 8 is LTS (5 years), JDK 18.3 will be only 6 months, and JDK 18.9 is LTS (5 years).
http://www.oracle.com/technetwork/java/eol-135779.html

I do not think Spark should support non-LTS releases. Especially for JDK 9 I do not see a strong technical need, but maybe I am overlooking something. Of course HTTP/2 etc. would be nice for the web interfaces, but that is currently not very urgent.


Oracle's new retirement strategy is "odd": it'll essentially be killing Java 9 updates before Java 8, and retiring Java 8 at the same time as the March '18 release. Like you say, not very motivational for an update.

At the same time: Java 8 is going away, and at some point the move to the newer versions will be needed, even if the new version isn't JDK 9 itself. It's generally helpful to be a bit proactive, especially getting all the dependencies bumped up and sorting out build & test. The real enemy is any incompatible change needed in the code, or something which breaks public/stable APIs. That, and some dependency on a library which is not compatible with Java 9 and which lacks a replacement: either you take on the maintenance yourself (bad), or you do the migration.




(*) I predict "Kerberos". It's always Kerberos. A move to a per-app JRE will complicate enabling full-length encryption keys, as the ASF isn't going to be able to ship the extended crypto JAR needed for Kerberos and 256-bit keys.
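
A quick way to check which crypto policy a given JRE is actually running with is to ask the JCE for the maximum permitted AES key length. A minimal sketch (plain Scala, nothing Spark-specific; the object name is just illustrative):

import javax.crypto.Cipher

object CryptoCheck {
  def main(args: Array[String]): Unit = {
    // 128 means the default "limited" crypto policy is in effect;
    // Integer.MAX_VALUE (printed as 2147483647) means the unlimited-strength
    // policy needed for 256-bit Kerberos keys is available.
    val maxAes = Cipher.getMaxAllowedKeyLength("AES")
    println(s"Max allowed AES key length: $maxAes")
  }
}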

-Steve

Re: Anyone knows how to build and spark on jdk9?

Posted by Sean Owen <so...@cloudera.com>.
Certainly, Scala 2.12 support precedes Java 9 support. A lot of the work is
in place already, and the last issue is dealing with how Scala closures are
now implemented quite differently with lambdas / invokedynamic. This affects
the ClosureCleaner. For the interested, this is, as far as I know, the main
remaining issue.
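
To make the closure point concrete, this is the shape of user code the
ClosureCleaner has to process before Spark ships it to executors. A minimal
sketch using the standard RDD API, with names chosen purely for illustration:

import org.apache.spark.{SparkConf, SparkContext}

object ClosureExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("closure-example").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val offset = 42  // captured by the closure below
    val rdd = sc.parallelize(1 to 10)

    // On Scala 2.11 this closure compiles to an anonymous inner class; on
    // 2.12 it becomes a lambda wired up via invokedynamic, which is why the
    // ClosureCleaner needs rework before 2.12 (and hence Java 9) support.
    val shifted = rdd.map(x => x + offset).collect()

    println(shifted.mkString(", "))
    sc.stop()
  }
}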

Despite the odd naming, all of these versions of Java are successors to
Java 9. Supporting any of them is probably the same thing, so the work for
now is still getting it working on Java 9.

Whereas Java has been very backwards-compatible in the past, the new module
structure is almost certain to break something in Spark or its
dependencies. Removing JAXB from the JDK alone causes issues. Getting it to
run at all on Java 9 may require changes, whereas compatibility with new
Java major releases in the past generally came for free. It'll be worth
trying to make that happen soonish. I'm guessing for Spark 3.x in the first
half of next year?
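
As a concrete instance of the JAXB point: on Java 9 the java.xml.bind module
is no longer resolved by default, so code that just worked on Java 8 starts
failing with ClassNotFoundException unless the JVM is started with
--add-modules java.xml.bind or the javax.xml.bind:jaxb-api artifact is put
on the classpath explicitly. A minimal probe (plain Scala, nothing
Spark-specific) that shows the difference between the two runtimes:

object JaxbProbe {
  def main(args: Array[String]): Unit = {
    // Succeeds out of the box on Java 8; fails on Java 9 unless the
    // java.xml.bind module is added or jaxb-api is an explicit dependency.
    try {
      Class.forName("javax.xml.bind.JAXBContext")
      println("JAXB is visible on this runtime")
    } catch {
      case _: ClassNotFoundException =>
        println("JAXB is not visible: add the module or the jaxb-api dependency")
    }
  }
}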

But, first things first. Scala 2.12 support.

On Fri, Oct 27, 2017 at 6:02 PM Jörn Franke <jo...@gmail.com> wrote:

> Scala 2.12 is not yet supported on Spark - this means also not JDK9:
> https://issues.apache.org/jira/plugins/servlet/mobile#issue/SPARK-14220
>
> If you look at the Oracle support then jdk 9 is anyway only supported for
> 6 months. JDK 8 is Lts (5 years) JDK 18.3 will be only 6 months and JDK
> 18.9 is lts (5 years).
> http://www.oracle.com/technetwork/java/eol-135779.html
>
> I do not think Spark should support non-lts releases. Especially for JDK9
> I do not see a strong technical need, but maybe I am overlooking something.
> Of course http2 etc would be nice for the web interfaces, but currently not
> very urgent.
>
> On 27. Oct 2017, at 04:44, Zhang, Liyun <li...@intel.com> wrote:
>
> Thanks your suggestion, seems that scala 2.12.4 support jdk9
>
>
>
> Scala 2.12.4 <https://github.com/scala/scala/releases/tag/v2.12.4> is now
> available.
>
> Our benchmarks
> <https://scala-ci.typesafe.com/grafana/dashboard/db/scala-benchmark?var-branch=2.12.x&from=1501580691158&to=1507711932006> show
> a further reduction in compile times since 2.12.3 of 5-10%.
>
> Improved Java 9 friendliness, with more to come!
>
>
>
> Best Regards
>
> Kelly Zhang/Zhang,Liyun
>
> *From:* Reynold Xin [mailto:rxin@databricks.com <rx...@databricks.com>]
> *Sent:* Friday, October 27, 2017 10:26 AM
> *To:* Zhang, Liyun <li...@intel.com>; dev@spark.apache.org;
> user@spark.apache.org
> *Subject:* Re: Anyone knows how to build and spark on jdk9?
>
>
>
> It probably depends on the Scala version we use in Spark supporting Java 9
> first.
>
>
>
> On Thu, Oct 26, 2017 at 7:22 PM Zhang, Liyun <li...@intel.com>
> wrote:
>
> Hi all:
>
> 1.       I want to build spark on jdk9 and test it with Hadoop on jdk9
> env. I search for jiras related to JDK9. I only found SPARK-13278
> <https://issues.apache.org/jira/browse/SPARK-13278>.  This means now
> spark can build or run successfully on JDK9 ?
>
>
>
>
>
> Best Regards
>
> Kelly Zhang/Zhang,Liyun
>
>
>
>

Re: Anyone knows how to build and spark on jdk9?

Posted by Jörn Franke <jo...@gmail.com>.
Scala 2.12 is not yet supported on Spark - this also means no JDK 9:
https://issues.apache.org/jira/plugins/servlet/mobile#issue/SPARK-14220

If you look at Oracle's support model, JDK 9 is anyway only supported for 6 months. JDK 8 is LTS (5 years), JDK 18.3 will be only 6 months, and JDK 18.9 is LTS (5 years).
http://www.oracle.com/technetwork/java/eol-135779.html

I do not think Spark should support non-LTS releases. Especially for JDK 9 I do not see a strong technical need, but maybe I am overlooking something. Of course HTTP/2 etc. would be nice for the web interfaces, but that is currently not very urgent.

> On 27. Oct 2017, at 04:44, Zhang, Liyun <li...@intel.com> wrote:
> 
> Thanks your suggestion, seems that scala 2.12.4 support jdk9
>  
> Scala 2.12.4 is now available.
> 
> Our benchmarks show a further reduction in compile times since 2.12.3 of 5-10%.
> 
> Improved Java 9 friendliness, with more to come!
> 
>  
> Best Regards
> Kelly Zhang/Zhang,Liyun
>  
>  
>  
>  
>  
> From: Reynold Xin [mailto:rxin@databricks.com] 
> Sent: Friday, October 27, 2017 10:26 AM
> To: Zhang, Liyun <li...@intel.com>; dev@spark.apache.org; user@spark.apache.org
> Subject: Re: Anyone knows how to build and spark on jdk9?
>  
> It probably depends on the Scala version we use in Spark supporting Java 9 first. 
>  
> On Thu, Oct 26, 2017 at 7:22 PM Zhang, Liyun <li...@intel.com> wrote:
> Hi all:
> 1.       I want to build spark on jdk9 and test it with Hadoop on jdk9 env. I search for jiras related to JDK9. I only found SPARK-13278.  This means now spark can build or run successfully on JDK9 ?
>  
>  
> Best Regards
> Kelly Zhang/Zhang,Liyun
>  

Re: Anyone knows how to build and spark on jdk9?

Posted by Jean Georges Perrin <jg...@jgp.net>.
May I ask what the use case is? It is a very interesting question, but I would be concerned about going further than a proof of concept. A lot of the enterprises I see and visit are barely on Java 8, so starting to talk about JDK 9 might be slight overkill - but if you have a good story, I'm all for it!

jg


> On Oct 27, 2017, at 03:44, Zhang, Liyun <li...@intel.com> wrote:
> 
> Thanks your suggestion, seems that scala 2.12.4 support jdk9
>  
> Scala 2.12.4 is now available.
> 
> Our benchmarks show a further reduction in compile times since 2.12.3 of 5-10%.
> 
> Improved Java 9 friendliness, with more to come!
> 
>  
> Best Regards
> Kelly Zhang/Zhang,Liyun
>  
>  
>  
>  
>  
> From: Reynold Xin [mailto:rxin@databricks.com] 
> Sent: Friday, October 27, 2017 10:26 AM
> To: Zhang, Liyun <li...@intel.com>; dev@spark.apache.org; user@spark.apache.org
> Subject: Re: Anyone knows how to build and spark on jdk9?
>  
> It probably depends on the Scala version we use in Spark supporting Java 9 first. 
>  
> On Thu, Oct 26, 2017 at 7:22 PM Zhang, Liyun <li...@intel.com> wrote:
> Hi all:
> 1.       I want to build spark on jdk9 and test it with Hadoop on jdk9 env. I search for jiras related to JDK9. I only found SPARK-13278.  This means now spark can build or run successfully on JDK9 ?
>  
>  
> Best Regards
> Kelly Zhang/Zhang,Liyun
>  

RE: Anyone knows how to build and spark on jdk9?

Posted by "Zhang, Liyun" <li...@intel.com>.
Thanks for your suggestion - it seems that Scala 2.12.4 supports JDK 9:


Scala 2.12.4<https://github.com/scala/scala/releases/tag/v2.12.4> is now available.

Our benchmarks<https://scala-ci.typesafe.com/grafana/dashboard/db/scala-benchmark?var-branch=2.12.x&from=1501580691158&to=1507711932006> show a further reduction in compile times since 2.12.3 of 5-10%.

Improved Java 9 friendliness, with more to come!

Best Regards
Kelly Zhang/Zhang,Liyun





From: Reynold Xin [mailto:rxin@databricks.com]
Sent: Friday, October 27, 2017 10:26 AM
To: Zhang, Liyun <li...@intel.com>; dev@spark.apache.org; user@spark.apache.org
Subject: Re: Anyone knows how to build and spark on jdk9?

It probably depends on the Scala version we use in Spark supporting Java 9 first.

On Thu, Oct 26, 2017 at 7:22 PM Zhang, Liyun <li...@intel.com> wrote:
Hi all:
1.       I want to build spark on jdk9 and test it with Hadoop on jdk9 env. I search for jiras related to JDK9. I only found SPARK-13278<https://issues.apache.org/jira/browse/SPARK-13278>.  This means now spark can build or run successfully on JDK9 ?


Best Regards
Kelly Zhang/Zhang,Liyun


Re: Anyone knows how to build and spark on jdk9?

Posted by Reynold Xin <rx...@databricks.com>.
It probably depends on the Scala version we use in Spark supporting Java 9
first.

On Thu, Oct 26, 2017 at 7:22 PM Zhang, Liyun <li...@intel.com> wrote:

> Hi all:
>
> 1.       I want to build spark on jdk9 and test it with Hadoop on jdk9
> env. I search for jiras related to JDK9. I only found SPARK-13278
> <https://issues.apache.org/jira/browse/SPARK-13278>.  This means now
> spark can build or run successfully on JDK9 ?
>
>
>
>
>
> Best Regards
>
> Kelly Zhang/Zhang,Liyun
>
>
>

Re: Anyone knows how to build and spark on jdk9?

Posted by Vadim Semenov <va...@datadoghq.com>.
If someone else is looking at how to try JDK 9, you can just pass your own
JAVA_HOME environment variables:

spark.yarn.appMasterEnv.JAVA_HOME=/usr/lib/jvm/java-1.8.0

spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-1.8.0
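
The same properties can also be set from code before the context is created,
or passed as --conf flags to spark-submit. A sketch under the assumption that
a JDK 9 install is unpacked at the same location on every node; the path
below is hypothetical, substitute wherever your JDK actually lives (the lines
above point at a Java 8 install):

import org.apache.spark.{SparkConf, SparkContext}

object Jdk9JavaHome {
  def main(args: Array[String]): Unit = {
    val jdk9Home = "/usr/lib/jvm/java-9"  // hypothetical path

    val conf = new SparkConf()
      .setAppName("jdk9-javahome-example")
      .set("spark.yarn.appMasterEnv.JAVA_HOME", jdk9Home)  // YARN application master
      .set("spark.executorEnv.JAVA_HOME", jdk9Home)        // executors

    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 100).count())
    sc.stop()
  }
}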


On Fri, Oct 27, 2017 at 5:14 AM, Steve Loughran <st...@hortonworks.com>
wrote:

>
> On 27 Oct 2017, at 03:21, Zhang, Liyun <li...@intel.com> wrote:
>
> Hi all:
> 1.       I want to build spark on jdk9 and test it with Hadoop on jdk9
> env. I search for jiras related to JDK9. I only found SPARK-13278
> <https://issues.apache.org/jira/browse/SPARK-13278>.  This means now
> spark can build or run successfully on JDK9 ?
>
>
> Best Regards
> Kelly Zhang/Zhang,Liyun
>
>
>
> Don't know about spark itself, but I do know that getting Hadoop on JDK9
> is still a WiP, primarily because they've locked it down so much (good)
> including the things hadoop gets at to make things like kerberos auth work
> (bad)
>
> https://issues.apache.org/jira/browse/HADOOP-11123
>
> A large part of the issues are with moving its dependencies to Java 9
> compatible ones (Log4J, mockito, JUnit) —all of those which only surface in
> testing and in the build itself won't be relevant for Spark standalone.
> Other than that, YARN doesn't work
>
> Most of the work has been done by one person (Akira @ NEC); if there are
> other people willing to help, including building & testing Spark against
> (locally built) JDK9 Hadoop artifacts life would be better. We could maybe
> build & release some alpha-quality Hadoop 3.1.x-alpha-JDK9 artifacts if
> that would help
>
> FWIW, there's long been some background chatter between the old Sun JDK
> team & the ASF big data stack devs; modules is something wonderful which
> will kill the need for shading and reduce/eliminate classpath hell. Earlier
> on there was some discussion about having proper 2d arrays & direct memory
> access of some structures, but that's not in this version. Give it time.
>
> Oracle are being aggressive about retiring Java 8: by Sept 2018 they plan
> to not provide public updates for it. Which means building against Java 9
> dev time is here for everyone
>

Re: Anyone knows how to build and spark on jdk9?

Posted by Steve Loughran <st...@hortonworks.com>.
On 27 Oct 2017, at 03:21, Zhang, Liyun <li...@intel.com> wrote:

Hi all:
1.       I want to build spark on jdk9 and test it with Hadoop on jdk9 env. I search for jiras related to JDK9. I only found SPARK-13278<https://issues.apache.org/jira/browse/SPARK-13278>.  This means now spark can build or run successfully on JDK9 ?


Best Regards
Kelly Zhang/Zhang,Liyun


Don't know about Spark itself, but I do know that getting Hadoop onto JDK 9 is still a WiP, primarily because they've locked the JDK down so much (good), including the internals Hadoop reaches into to make things like Kerberos auth work (bad):

https://issues.apache.org/jira/browse/HADOOP-11123
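
To make "the internals Hadoop reaches into" concrete: Hadoop's Kerberos utilities have historically used reflection against JDK-internal classes such as sun.security.krb5, which the Java 9 module system no longer exposes to classpath code by default, so those packages have to be opened explicitly at launch time. A sketch only; the exact packages that need opening depend on the Hadoop version, so treat the flag below as an illustration rather than a recipe:

object Jdk9KerberosFlags {
  // Illustrative --add-opens flag: sun.security.krb5 lives in the
  // java.security.jgss module, and ALL-UNNAMED grants access to classpath code.
  private val openKrb5 =
    "--add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED"

  // Entries one would put in spark-defaults.conf or pass with --conf on
  // spark-submit (driver JVM options must be supplied before launch).
  val confEntries: Map[String, String] = Map(
    "spark.driver.extraJavaOptions"   -> openKrb5,
    "spark.executor.extraJavaOptions" -> openKrb5
  )

  def main(args: Array[String]): Unit =
    confEntries.foreach { case (k, v) => println(s"--conf $k=$v") }
}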

A large part of the issues are about moving its dependencies to Java 9-compatible ones (Log4J, Mockito, JUnit); those which only surface in testing and in the build itself won't be relevant for Spark standalone. Other than that, YARN doesn't work.

Most of the work has been done by one person (Akira @ NEC); if there are other people willing to help, including building & testing Spark against (locally built) JDK9 Hadoop artifacts, life would be better. We could maybe build & release some alpha-quality Hadoop 3.1.x-alpha-JDK9 artifacts if that would help.

FWIW, there's long been some background chatter between the old Sun JDK team & the ASF big data stack devs; modules is something wonderful which will kill the need for shading and reduce/eliminate classpath hell. Earlier on there was some discussion about having proper 2d arrays & direct memory access of some structures, but that's not in this version. Give it time.

Oracle are being aggressive about retiring Java 8: by Sept 2018 they plan to stop providing public updates for it. Which means the time to build against Java 9 is here for everyone.