Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/01/12 14:24:39 UTC
[jira] [Resolved] (SPARK-12582) IndexShuffleBlockResolverSuite fails in Windows
[ https://issues.apache.org/jira/browse/SPARK-12582?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-12582.
-------------------------------
Resolution: Fixed
Fix Version/s: 1.6.1
2.0.0
Issue resolved by pull request 10526
[https://github.com/apache/spark/pull/10526]
> IndexShuffleBlockResolverSuite fails in Windows
> -----------------------------------------------
>
> Key: SPARK-12582
> URL: https://issues.apache.org/jira/browse/SPARK-12582
> Project: Spark
> Issue Type: Bug
> Components: Tests, Windows
> Reporter: yucai
> Assignee: yucai
> Fix For: 2.0.0, 1.6.1
>
>
> IndexShuffleBlockResolverSuite fails on my Windows development machine.
> {code}
> [info] IndexShuffleBlockResolverSuite:
> [info] - commit shuffle files multiple times *** FAILED *** (388 milliseconds)
> [info] Array(10, 0, 20) equaled Array(10, 0, 20) (IndexShuffleBlockResolverSuite.scala:108)
> [info] org.scalatest.exceptions.TestFailedException:
> .....
> .....
> [info] Exception encountered when attempting to run a suite with class name: org.apache.spark.shuffle.sort.IndexShuffleBlockResolverSuite *** ABORTED *** (2 seconds, 234 milliseconds)
> [info] java.io.IOException: Failed to delete: C:\Users\yyu29\Documents\codes.next\spark\target\tmp\spark-0e81a15a-e712-4b1c-a089-f421db149e65
> [info] at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:940)
> [info] at org.apache.spark.shuffle.sort.IndexShuffleBlockResolverSuite.afterEach(IndexShuffleBlockResolverSuite.scala:60)
> [info] at org.scalatest.BeforeAndAfterEach$class.afterEach(BeforeAndAfterEach.scala:205)
> [info] at org.apache.spark.shuffle.sort.IndexShuffleBlockResolverSuite.afterEach(IndexShuffleBlockResolverSuite.scala:36)
> [info] at org.scalatest.BeforeAndAfterEach$class.afterEach(BeforeAndAfterEach.scala:220)
> [info] at org.apache.spark.shuffle.sort.IndexShuffleBlockResolverSuite.afterEach(IndexShuffleBlockResolverSuite.scala:36)
> {code}
> The root cause is that when "afterEach" tries to clean up the test data, some files are still open. For example:
> {code}
> // The dataFile should be the previous one
> val in = new FileInputStream(dataFile)
> val firstByte = new Array[Byte](1)
> in.read(firstByte)
> assert(firstByte(0) === 0)
> {code}
> The stream is never closed: "in.close()" is missing.
> On Linux this is not a problem, since you can delete a file even while it is open, but on Windows the deletion fails with "resource is busy".
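> A minimal sketch of the fix is to close the stream in a finally block once the byte has been read (the helper name "firstByteOf" here is hypothetical, for illustration only, and is not necessarily how the actual pull request fixes it):
> {code}
> import java.io.{File, FileInputStream}
>
> // Read the first byte of the data file, always closing the stream
> // so that afterEach() can delete the temp directory on Windows too.
> def firstByteOf(dataFile: File): Byte = {
>   val in = new FileInputStream(dataFile)
>   try {
>     val firstByte = new Array[Byte](1)
>     in.read(firstByte)
>     firstByte(0)
>   } finally {
>     in.close()
>   }
> }
> {code}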
> Another issue is that IndexShuffleBlockResolverSuite.scala is a Scala file, yet it is placed under "test/java".
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org