Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2016/12/15 07:58:58 UTC

[jira] [Commented] (SPARK-18878) Fix/investigate the more identified test failures in Java/Scala on Windows

    [ https://issues.apache.org/jira/browse/SPARK-18878?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15750709#comment-15750709 ] 

Hyukjin Kwon commented on SPARK-18878:
--------------------------------------

cc [~srowen] Currently, not all of the test failures could be identified because of the one-hour time limit in AppVeyor. The limit was increased for my account after a manual request - https://github.com/appveyor/ci/issues/517 - but it seems it cannot be raised much further (it is now up to one hour and 30 minutes).

I will specify the errors in each child task after testing them separately by hand where required.

> Fix/investigate the more identified test failures in Java/Scala on Windows
> --------------------------------------------------------------------------
>
>                 Key: SPARK-18878
>                 URL: https://issues.apache.org/jira/browse/SPARK-18878
>             Project: Spark
>          Issue Type: Test
>          Components: Tests
>            Reporter: Hyukjin Kwon
>
> It seems many tests are failing on Windows. Some failures are confined to the tests themselves, whereas others stem from the functionality under test, causing actual failures for some APIs on Windows.
> The tests were hanging due to the issues in SPARK-17591 and SPARK-18785; now we can apparently proceed much further (it seems we might reach the end).
> The tests proceeded via AppVeyor - https://ci.appveyor.com/project/spark-test/spark/build/259-spark-test-windows



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org