Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2018/11/13 07:08:00 UTC

[jira] [Updated] (SPARK-25986) Banning throw new OutOfMemoryErrors

     [ https://issues.apache.org/jira/browse/SPARK-25986?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-25986:
------------------------------
    Target Version/s: 3.0.0
              Labels: release-notes  (was: starter)

Per the pull request, I propose we expand this to include all Error subclasses. In almost all cases an Error is the wrong thing to throw, as Error subclasses are meant to represent internal JVM errors. Because a different exception would be thrown, this is a user-visible change and so can only happen in 3.0, but that seems like a good place to do it.
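
For illustration only, here is a minimal sketch of what the broader ban would mean at a call site; it is not code from the pull request, and the method names and the replacement exception type (IllegalStateException) are assumptions.

    // Before: constructs a java.lang.Error subclass, which is meant to signal
    // internal JVM failures rather than conditions Spark itself detects.
    def reserveOld(requested: Long, available: Long): Unit = {
      if (requested > available) {
        throw new OutOfMemoryError(s"Unable to reserve $requested bytes")
      }
    }

    // After: throws an ordinary exception, so callers see a regular failure
    // instead of an Error. The concrete exception type used in Spark may
    // differ; IllegalStateException here is only illustrative.
    def reserveNew(requested: Long, available: Long): Unit = {
      if (requested > available) {
        throw new IllegalStateException(s"Unable to reserve $requested bytes")
      }
    }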

> Banning throw new OutOfMemoryErrors
> -----------------------------------
>
>                 Key: SPARK-25986
>                 URL: https://issues.apache.org/jira/browse/SPARK-25986
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 3.0.0
>            Reporter: Xiao Li
>            Priority: Major
>              Labels: release-notes
>
> Add a linter rule to ban construction of new OutOfMemoryErrors, then make sure that we throw the correct exceptions instead. See the PR https://github.com/apache/spark/pull/22969
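
As a self-contained sketch of what such a linter check boils down to, the following scans source files and flags any line that constructs an OutOfMemoryError directly. This is not Spark's actual configuration (in Spark this would more likely be expressed as a Scalastyle regex rule); the object name and the pattern are assumptions.

    import scala.io.Source

    object BanThrowErrorCheck {
      // Pattern for direct construction of OutOfMemoryError; a broader rule
      // could also match other Error subclasses by name.
      private val banned = """throw\s+new\s+OutOfMemoryError""".r

      // Returns (line number, offending line) pairs for one source file.
      def findViolations(path: String): Seq[(Int, String)] = {
        val source = Source.fromFile(path)
        try {
          source.getLines().zipWithIndex.collect {
            case (line, idx) if banned.findFirstIn(line).isDefined =>
              (idx + 1, line.trim)
          }.toSeq
        } finally {
          source.close()
        }
      }

      def main(args: Array[String]): Unit = {
        for (file <- args; (lineNo, text) <- findViolations(file)) {
          println(s"$file:$lineNo: $text")
        }
      }
    }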



