Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/06/22 05:18:02 UTC
[jira] [Assigned] (SPARK-21170) Utils.tryWithSafeFinallyAndFailureCallbacks throws IllegalArgumentException: Self-suppression not permitted
[ https://issues.apache.org/jira/browse/SPARK-21170?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-21170:
------------------------------------
Assignee: Apache Spark
> Utils.tryWithSafeFinallyAndFailureCallbacks throws IllegalArgumentException: Self-suppression not permitted
> -----------------------------------------------------------------------------------------------------------
>
> Key: SPARK-21170
> URL: https://issues.apache.org/jira/browse/SPARK-21170
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.1.1
> Reporter: Devaraj K
> Assignee: Apache Spark
> Priority: Minor
>
> {code}
> 17/06/20 22:49:39 ERROR Executor: Exception in task 225.0 in stage 1.0 (TID 27225)
> java.lang.IllegalArgumentException: Self-suppression not permitted
> at java.lang.Throwable.addSuppressed(Throwable.java:1043)
> at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1400)
> at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1145)
> at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1125)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
> at org.apache.spark.scheduler.Task.run(Task.scala:108)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:341)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> {code}
> {code}
> 17/06/20 22:52:32 INFO scheduler.TaskSetManager: Lost task 427.0 in stage 1.0 (TID 27427) on 192.168.1.121, executor 12: java.lang.IllegalArgumentException (Self-suppression not permitted) [duplicate 1]
> 17/06/20 22:52:33 INFO scheduler.TaskSetManager: Starting task 427.1 in stage 1.0 (TID 27764, 192.168.1.122, executor 106, partition 427, PROCESS_LOCAL, 4625 bytes)
> 17/06/20 22:52:33 INFO scheduler.TaskSetManager: Lost task 186.0 in stage 1.0 (TID 27186) on 192.168.1.122, executor 106: java.lang.IllegalArgumentException (Self-suppression not permitted) [duplicate 2]
> 17/06/20 22:52:38 INFO scheduler.TaskSetManager: Starting task 186.1 in stage 1.0 (TID 27765, 192.168.1.121, executor 9, partition 186, PROCESS_LOCAL, 4625 bytes)
> 17/06/20 22:52:38 WARN scheduler.TaskSetManager: Lost task 392.0 in stage 1.0 (TID 27392, 192.168.1.121, executor 9): java.lang.IllegalArgumentException: Self-suppression not permitted
> at java.lang.Throwable.addSuppressed(Throwable.java:1043)
> at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1400)
> at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1145)
> at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1125)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
> at org.apache.spark.scheduler.Task.run(Task.scala:108)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:341)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> {code}
> Here tryWithSafeFinallyAndFailureCallbacks is trying to add the same Throwable instance to its own suppressed list, so Throwable.addSuppressed throws an IllegalArgumentException ("Self-suppression not permitted") that masks the original exception.
> I think it should not add the exception to the suppressed list when it is the same instance, as sketched below.
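>
> As a minimal, self-contained sketch (not the actual Utils.scala code; addSuppressedSafely is a hypothetical helper name), the following shows both the JDK restriction and a same-instance guard of the kind suggested above:
> {code:scala}
> // Demonstrates why t.addSuppressed(t) fails, and a guard that avoids it.
> object SelfSuppressionDemo {
>   def main(args: Array[String]): Unit = {
>     val original = new RuntimeException("task failed")
>
>     // The JDK forbids a throwable from suppressing itself and throws
>     // java.lang.IllegalArgumentException: Self-suppression not permitted.
>     try original.addSuppressed(original)
>     catch {
>       case e: IllegalArgumentException => println(s"JDK rejects it: ${e.getMessage}")
>     }
>
>     // Hypothetical guard: only record a *different* instance, so the
>     // original exception is never masked by the bookkeeping itself.
>     def addSuppressedSafely(primary: Throwable, toSuppress: Throwable): Unit =
>       if (primary ne toSuppress) primary.addSuppressed(toSuppress)
>
>     addSuppressedSafely(original, original)                             // same instance: skipped
>     addSuppressedSafely(original, new RuntimeException("close failed")) // different: recorded
>     original.getSuppressed.foreach(s => println(s"suppressed: ${s.getMessage}"))
>   }
> }
> {code}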
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)