Posted to issues@spark.apache.org by "Deej (JIRA)" <ji...@apache.org> on 2018/11/09 09:09:00 UTC

[jira] [Comment Edited] (SPARK-12216) Spark failed to delete temp directory

    [ https://issues.apache.org/jira/browse/SPARK-12216?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16681103#comment-16681103 ] 

Deej edited comment on SPARK-12216 at 11/9/18 9:08 AM:
-------------------------------------------------------

This issue has *NOT* been fixed, so marking it as Resolved is plain silly. Moreover, suggesting that users switch to other OSes is not only reckless but also regressive when a large community of users is attempting to adopt Spark as one of their large-scale data processing tools. So please stop with the condescension and work on fixing this bug, as the community has been requesting for a long while now.

 

As others have reported, I am able to launch spark-shell and perform basic tasks (including sc.stop()) successfully. However, the moment I try to quit the REPL session, it craps out immediately. I am also able to manually delete the temp files/folders Spark creates in the temp directory, so there are no permission issues. Even executing these commands from a command prompt running as Administrator results in the same error, reinforcing the conclusion that this is not related to permissions on the temp folder at all.
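
To illustrate why I don't think permissions are the culprit, here is a minimal Scala sketch (my own illustration, not Spark code; the names in it are made up): on Windows, a file with an open handle cannot be deleted regardless of permissions, whereas the same delete succeeds on Linux/macOS. A REPL classloader holding Spark's temp files open would fail in exactly this way.

import java.io.FileInputStream
import java.nio.file.Files

val tmp = Files.createTempFile("lock-demo", ".tmp")
val in = new FileInputStream(tmp.toFile)  // hold an open handle, as a classloader would
try {
  Files.delete(tmp)                       // succeeds on Linux/macOS despite the handle...
  println(s"deleted $tmp while a handle was open")
} catch {
  // ...but on Windows this typically throws java.nio.file.FileSystemException
  // ("being used by another process") -- permissions never enter into it.
  case e: java.io.IOException => println(s"delete failed: $e")
} finally {
  in.close()
}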

Here is my set-up to reproduce this issue:

OS: Windows 10

Spark: version 2.3.2

Scala: version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_171)
  
 Stack trace:
 ===========================
 scala> sc
 res0: org.apache.spark.SparkContext = org.apache.spark.SparkContext@41167ded
 scala> sc.stop()
 scala> :quit
 2018-11-09 00:10:42 ERROR ShutdownHookManager:91 - Exception while deleting Spark temp dir: C:\Users\user1\AppData\Local\Temp\spark-b155db59-b7c5-4f64-8cfb-00d8f95ea348\repl-fed61a6e-3a1e-46cf-90e9-3fbfcb8a1d87
 java.io.IOException: Failed to delete: C:\Users\user1\AppData\Local\Temp\spark-b155db59-b7c5-4f64-8cfb-00d8f95ea348\repl-fed61a6e-3a1e-46cf-90e9-3fbfcb8a1d87
         at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1074)
         at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
         at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
         at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
         at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
         at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
         at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
         at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
 ================================================
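
For anyone not familiar with the code path in that trace, the sketch below is a simplified reconstruction in the spirit of org.apache.spark.util.Utils.deleteRecursively, not the actual Spark source; it shows where the "Failed to delete" IOException originates once the OS refuses the delete.

import java.io.{File, IOException}

// Simplified sketch, NOT Spark's real implementation.
def deleteRecursively(file: File): Unit = {
  if (file.isDirectory) {
    // listFiles can return null on IO errors, hence the Option guard
    Option(file.listFiles).getOrElse(Array.empty[File]).foreach(deleteRecursively)
  }
  // On Windows, delete() returns false while another component (here, the
  // REPL's classloader) still holds the file open; a check like this then
  // raises the IOException seen in the stack trace above.
  if (!file.delete() && file.exists()) {
    throw new IOException(s"Failed to delete: ${file.getAbsolutePath}")
  }
}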



> Spark failed to delete temp directory 
> --------------------------------------
>
>                 Key: SPARK-12216
>                 URL: https://issues.apache.org/jira/browse/SPARK-12216
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>         Environment: windows 7 64 bit
> Spark 1.5.2
> Java 1.8.0.65
> PATH includes:
> C:\Users\Stefan\spark-1.5.2-bin-hadoop2.6\bin
> C:\ProgramData\Oracle\Java\javapath
> C:\Users\Stefan\scala\bin
> SYSTEM variables set are:
> JAVA_HOME=C:\Program Files\Java\jre1.8.0_65
> HADOOP_HOME=C:\Users\Stefan\hadoop-2.6.0\bin
> (where the bin\winutils resides)
> both \tmp and \tmp\hive have permissions
> drwxrwxrwx as detected by winutils ls
>            Reporter: stefan
>            Priority: Minor
>
> The mailing list archives have no obvious solution to this:
> scala> :q
> Stopping spark context.
> 15/12/08 16:24:22 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\Stefan\AppData\Local\Temp\spark-18f2a418-e02f-458b-8325-60642868fdff
> java.io.IOException: Failed to delete: C:\Users\Stefan\AppData\Local\Temp\spark-18f2a418-e02f-458b-8325-60642868fdff
>         at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:884)
>         at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:63)
>         at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:60)
>         at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
>         at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:60)
>         at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:264)
>         at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:234)
>         at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
>         at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
>         at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
>         at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:234)
>         at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
>         at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
>         at scala.util.Try$.apply(Try.scala:161)
>         at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:234)
>         at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:216)
>         at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
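
Since the original report notes that the mailing list archives have no obvious solution, here is one low-risk mitigation sometimes suggested for this class of problem; this is an assumption on my part, not something confirmed in this thread. The shutdown hook cannot delete these directories while the JVM still holds files in them open, but nothing holds them after the JVM exits, so leftovers can be cleared before the *next* session starts.

import java.io.File

// Hedged workaround sketch (my own, not from this thread): run this, or an
// equivalent script, before launching spark-shell to sweep stale spark-*
// directories left behind by a previous session.
val tmpDir = new File(System.getProperty("java.io.tmpdir"))
def rm(f: File): Unit = {
  if (f.isDirectory) Option(f.listFiles).getOrElse(Array.empty[File]).foreach(rm)
  f.delete()  // best effort; nothing should hold these files open anymore
}
Option(tmpDir.listFiles).getOrElse(Array.empty[File])
  .filter(f => f.isDirectory && f.getName.startsWith("spark-"))
  .foreach { dir => rm(dir); println(s"removed ${dir.getAbsolutePath}") }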


