Posted to dev@spark.apache.org by Sean Owen <sr...@gmail.com> on 2018/11/08 18:46:48 UTC

On Java 9+ support, Cleaners, modules and the death of reflection

I think this is a key thread, perhaps one of the only big problems,
for Java 9+ support:

https://issues.apache.org/jira/browse/SPARK-24421?focusedCommentId=16680169&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16680169

As far as I can tell, we basically can't access a certain method
(Cleaner.clean()) anymore without the user adding JVM flags to allow
it. I'm working out what the alternatives and implications are.
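For context, the kind of reflective access at stake looks roughly like this. This is my own sketch, not Spark's actual Platform code; the class name CleanerProbe and the probe() helper are illustrative. On Java 8 the setAccessible call succeeds and the buffer's cleaner can be invoked; on Java 9+ the module system rejects it (InaccessibleObjectException) unless the package is opened via JVM flags:

```java
import java.lang.reflect.Method;
import java.nio.ByteBuffer;

public class CleanerProbe {

    // Returns "accessible" if the direct buffer's cleaner can be invoked
    // reflectively, otherwise "blocked: <exception type>".
    static String probe() {
        ByteBuffer buf = ByteBuffer.allocateDirect(16);
        try {
            // DirectByteBuffer exposes a cleaner() method; the Cleaner class
            // it returns moved to a JDK-internal package in Java 9+.
            Method cleanerMethod = buf.getClass().getMethod("cleaner");
            cleanerMethod.setAccessible(true); // throws on Java 9+ by default
            Object cleaner = cleanerMethod.invoke(buf);
            Method clean = cleaner.getClass().getMethod("clean");
            clean.setAccessible(true);
            clean.invoke(cleaner); // frees the off-heap memory eagerly
            return "accessible";
        } catch (ReflectiveOperationException | RuntimeException e) {
            return "blocked: " + e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(probe());
    }
}
```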

Thoughts welcome.

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: On Java 9+ support, Cleaners, modules and the death of reflection

Posted by Sean Owen <sr...@gmail.com>.
For those following, I have a PR up at
https://github.com/apache/spark/pull/22993

The implication is that ignoring MaxDirectMemorySize no longer works
out of the box on Java 9+. However, you can make it work by setting
JVM flags to allow access to the new Cleaner class. Or set
MaxDirectMemorySize explicitly if it's an issue. Or do nothing if you
don't actually run up against the MaxDirectMemorySize limit, which
seems to default to the size of the JVM heap.
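As a concrete illustration of the flag-based workaround: the exact package to open is my assumption (the internal Cleaner lives in jdk.internal.ref on recent JDKs; verify against the JDK you actually run), and the 2g value is just a placeholder:

```shell
# Illustrative only: open the JDK-internal package so reflective access
# to the Cleaner is allowed again (Java 9+ module syntax).
spark-submit \
  --conf "spark.driver.extraJavaOptions=--add-opens=java.base/jdk.internal.ref=ALL-UNNAMED" \
  --conf "spark.executor.extraJavaOptions=--add-opens=java.base/jdk.internal.ref=ALL-UNNAMED" \
  ...

# Alternatively, cap direct memory explicitly instead of relying on the
# Cleaner to release it eagerly:
#   -XX:MaxDirectMemorySize=2g
```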

