Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2016/09/19 04:23:21 UTC

[jira] [Updated] (SPARK-17591) Fix/investigate the failure of tests in Scala On Windows

     [ https://issues.apache.org/jira/browse/SPARK-17591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-17591:
---------------------------------
    Description: 
{code}
Tests run: 90, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 17.53 sec <<< FAILURE! - in org.apache.spark.JavaAPISuite
wholeTextFiles(org.apache.spark.JavaAPISuite)  Time elapsed: 0.313 sec  <<< FAILURE!
java.lang.AssertionError: 
expected:<spark is easy to use.
> but was:<null>
	at org.apache.spark.JavaAPISuite.wholeTextFiles(JavaAPISuite.java:1089)
{code}
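One plausible cause of the {{expected:<...> but was:<null>}} assertion above (a guess from the symptom, not confirmed from the log) is a path-form mismatch: {{wholeTextFiles}} returns (path -> content) pairs, and a lookup key built by string concatenation with {{/}} will not match the platform form Windows reports. A minimal sketch with hypothetical paths, not the actual suite code:

```java
import java.util.HashMap;
import java.util.Map;

public class PathKeyMismatchDemo {
    public static void main(String[] args) {
        // wholeTextFiles yields (path -> content) pairs. If the test builds
        // its lookup key as dir + "/part-00000" but the returned key uses the
        // platform form (backslashes, drive letter, or a file: URI), get()
        // returns null and the assertion reports expected:<...> but was:<null>.
        Map<String, String> byPath = new HashMap<>();
        byPath.put("file:/C:/projects/spark/tmp/part-00000", "spark is easy to use.\n");

        String handBuiltKey = "C:\\projects\\spark\\tmp" + "/part-00000";
        System.out.println(byPath.get(handBuiltKey)); // prints "null": keys differ
    }
}
```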

{code}
Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.062 sec <<< FAILURE! - in org.apache.spark.launcher.SparkLauncherSuite
testChildProcLauncher(org.apache.spark.launcher.SparkLauncherSuite)  Time elapsed: 0.047 sec  <<< FAILURE!
java.lang.AssertionError: expected:<0> but was:<1>
	at org.apache.spark.launcher.SparkLauncherSuite.testChildProcLauncher(SparkLauncherSuite.java:177)
{code}

{code}
Tests run: 53, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 22.325 sec <<< FAILURE! - in org.apache.spark.streaming.JavaAPISuite
testCheckpointMasterRecovery(org.apache.spark.streaming.JavaAPISuite)  Time elapsed: 3.418 sec  <<< ERROR!
java.io.IOException: Failed to delete: C:\projects\spark\streaming\target\tmp\1474255953021-0
	at org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery(JavaAPISuite.java:1808)
Running org.apache.spark.streaming.JavaDurationSuite
{code}

{code}
Running org.apache.spark.streaming.JavaAPISuite
Tests run: 53, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 22.325 sec <<< FAILURE! - in org.apache.spark.streaming.JavaAPISuite
testCheckpointMasterRecovery(org.apache.spark.streaming.JavaAPISuite)  Time elapsed: 3.418 sec  <<< ERROR!
java.io.IOException: Failed to delete: C:\projects\spark\streaming\target\tmp\1474255953021-0
	at org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery(JavaAPISuite.java:1808)
{code}

{code}
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.016 sec - in org.apache.spark.streaming.JavaWriteAheadLogSuite
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
Results :
Tests in error: 
  JavaAPISuite.testCheckpointMasterRecovery:1808 � IO Failed to delete: C:\proje...
{code}
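The repeated {{Failed to delete}} errors look like the usual Windows file-locking behavior: some handle on the checkpoint directory is still open when the suite tries to remove it, and Windows (unlike POSIX) refuses to delete a file with an open handle. A minimal, self-contained sketch of that behavior (hypothetical file names, not Spark code):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class DeleteWhileOpenDemo {
    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("spark-17591", ".tmp");
        FileOutputStream out = new FileOutputStream(tmp);
        out.write(1);

        // On Windows the OS holds a lock on open files, so this delete fails
        // (and a recursive cleanup then surfaces as
        // "IOException: Failed to delete: ..."). On POSIX systems the unlink
        // succeeds even while the stream is still open.
        boolean deletedWhileOpen = tmp.delete();

        out.close();
        // Once every handle is closed, deletion succeeds on all platforms.
        boolean deletedAfterClose = !tmp.exists() || tmp.delete();
        System.out.println("afterClose=" + deletedAfterClose);
    }
}
```

If this is the cause, the fix is to close the stream (or stop the streaming context) before the temp directory is cleaned up, rather than anything Windows-specific.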

The tests were aborted for an unknown reason during the SQL tests; {{BroadcastJoinSuite}} continuously emitted the exceptions below:

{code}
20:48:09.876 ERROR org.apache.spark.deploy.worker.ExecutorRunner: Error running executor
java.io.IOException: Cannot run program "C:\Progra~1\Java\jdk1.8.0\bin\java" (in directory "C:\projects\spark\work\app-20160918204809-0000\0"): CreateProcess error=206, The filename or extension is too long
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
	at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:167)
	at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:73)
Caused by: java.io.IOException: CreateProcess error=206, The filename or extension is too long
	at java.lang.ProcessImpl.create(Native Method)
	at java.lang.ProcessImpl.<init>(ProcessImpl.java:386)
	at java.lang.ProcessImpl.start(ProcessImpl.java:137)
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
	... 2 more
{code}
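{{CreateProcess error=206}} is the Windows limit on command-line length (32,767 characters for {{CreateProcess}}): the worker appears to launch each executor with the full classpath inline on the command line, which can easily overflow that limit. A back-of-the-envelope sketch with hypothetical jar paths:

```java
public class CommandLineLimitDemo {
    // CreateProcess on Windows rejects command lines longer than 32,767 chars.
    static final int WINDOWS_CMD_LIMIT = 32_767;

    public static void main(String[] args) {
        // Hypothetical jar names; the real executor command embeds the whole
        // Spark classpath, so a few hundred dependency jars is realistic.
        StringBuilder cmd = new StringBuilder("C:\\Progra~1\\Java\\jdk1.8.0\\bin\\java -cp ");
        for (int i = 0; i < 600; i++) {
            cmd.append("C:\\projects\\spark\\assembly\\target\\scala-2.11\\jars\\dependency-")
               .append(i)
               .append(".jar;");
        }
        System.out.println("exceedsWindowsLimit=" + (cmd.length() > WINDOWS_CMD_LIMIT));
    }
}
```

A common workaround (not verified against Spark's launcher here) is to pass a single "pathing" jar whose manifest Class-Path lists the real jars, keeping the command line short.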

Here is the full log for the test - https://ci.appveyor.com/project/spark-test/spark/build/15-scala-tests

We may have to create sub-tasks if these turn out to be actual issues on Windows rather than mistakes in the tests themselves.

I am willing to test this again after fixing some of the issues here, in particular the last one.

I triggered the build with the commands below:

{code}
mvn -DskipTests -Phadoop-2.6 -Phive -Phive-thriftserver package
mvn -Phadoop-2.6 -Phive -Phive-thriftserver --fail-never test
{code}

  was:
{code}
Tests run: 90, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 17.53 sec <<< FAILURE! - in org.apache.spark.JavaAPISuite
wholeTextFiles(org.apache.spark.JavaAPISuite)  Time elapsed: 0.313 sec  <<< FAILURE!
java.lang.AssertionError: 
expected:<spark is easy to use.
> but was:<null>
	at org.apache.spark.JavaAPISuite.wholeTextFiles(JavaAPISuite.java:1089)
{code}

{code}
Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.062 sec <<< FAILURE! - in org.apache.spark.launcher.SparkLauncherSuite
testChildProcLauncher(org.apache.spark.launcher.SparkLauncherSuite)  Time elapsed: 0.047 sec  <<< FAILURE!
java.lang.AssertionError: expected:<0> but was:<1>
	at org.apache.spark.launcher.SparkLauncherSuite.testChildProcLauncher(SparkLauncherSuite.java:177)
{code}

{code}
Tests run: 53, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 22.325 sec <<< FAILURE! - in org.apache.spark.streaming.JavaAPISuite
testCheckpointMasterRecovery(org.apache.spark.streaming.JavaAPISuite)  Time elapsed: 3.418 sec  <<< ERROR!
java.io.IOException: Failed to delete: C:\projects\spark\streaming\target\tmp\1474255953021-0
	at org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery(JavaAPISuite.java:1808)
Running org.apache.spark.streaming.JavaDurationSuite
{code}

{code}
Running org.apache.spark.streaming.JavaAPISuite
Tests run: 53, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 22.325 sec <<< FAILURE! - in org.apache.spark.streaming.JavaAPISuite
testCheckpointMasterRecovery(org.apache.spark.streaming.JavaAPISuite)  Time elapsed: 3.418 sec  <<< ERROR!
java.io.IOException: Failed to delete: C:\projects\spark\streaming\target\tmp\1474255953021-0
	at org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery(JavaAPISuite.java:1808)
{code}

{code}
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.016 sec - in org.apache.spark.streaming.JavaWriteAheadLogSuite
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
Results :
Tests in error: 
  JavaAPISuite.testCheckpointMasterRecovery:1808 � IO Failed to delete: C:\proje...
{code}

The tests were aborted for an unknown reason during the SQL tests; {{BroadcastJoinSuite}} continuously emitted the exceptions below:

{code}
20:48:09.876 ERROR org.apache.spark.deploy.worker.ExecutorRunner: Error running executor
java.io.IOException: Cannot run program "C:\Progra~1\Java\jdk1.8.0\bin\java" (in directory "C:\projects\spark\work\app-20160918204809-0000\0"): CreateProcess error=206, The filename or extension is too long
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
	at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:167)
	at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:73)
Caused by: java.io.IOException: CreateProcess error=206, The filename or extension is too long
	at java.lang.ProcessImpl.create(Native Method)
	at java.lang.ProcessImpl.<init>(ProcessImpl.java:386)
	at java.lang.ProcessImpl.start(ProcessImpl.java:137)
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
	... 2 more
{code}

Here is the full log for the test - https://ci.appveyor.com/project/spark-test/spark/build/15-scala-tests

We may have to create sub-tasks if these turn out to be actual issues on Windows rather than mistakes in the tests themselves.

I am willing to test this again after fixing some of the issues here, in particular the last one.


> Fix/investigate the failure of tests in Scala On Windows
> --------------------------------------------------------
>
>                 Key: SPARK-17591
>                 URL: https://issues.apache.org/jira/browse/SPARK-17591
>             Project: Spark
>          Issue Type: Test
>          Components: Build, Spark Core, SQL, Streaming
>            Reporter: Hyukjin Kwon


--
This message was sent by Atlassian JIRA
(v6.3.4#6332)