Posted to reviews@spark.apache.org by HyukjinKwon <gi...@git.apache.org> on 2017/01/15 14:28:01 UTC

[GitHub] spark pull request #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the re...

GitHub user HyukjinKwon opened a pull request:

    https://github.com/apache/spark/pull/16586

    [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of flaky, newly introduced and missed test failures on Windows

    ## What changes were proposed in this pull request?
    
    **Failed tests**
    
    ```
    org.apache.spark.sql.hive.execution.HiveQuerySuite:
     - transform with SerDe3 *** FAILED ***
     - transform with SerDe4 *** FAILED ***
    ```
    
    ```
    org.apache.spark.sql.hive.execution.HiveDDLSuite:
     - create hive serde table with new syntax *** FAILED ***
     - add/drop partition with location - managed table *** FAILED ***
    ```
    
    **Aborted tests**
    
    ```
    Exception encountered when attempting to run a suite with class name: org.apache.spark.sql.hive.execution.HiveSerDeSuite *** ABORTED *** (157 milliseconds)
       org.apache.spark.sql.AnalysisException: LOAD DATA input path does not exist: C:\projects\spark\sql\hive\target\scala-2.11\test-classes\data\files\sales.txt;
    ```
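    
    The mangled path above is consistent with the backslashes of a raw Windows path being consumed as escape sequences (note the literal tabs where `\t` used to be). A minimal, self-contained sketch of the URI-based workaround applied in this PR's diffs (e.g. `${part1Path.toURI}`):
    
    ```scala
    import java.io.File

    // Illustration only: File.toURI always produces forward slashes, so the
    // resulting string is safe to embed in SQL text on any OS, unlike a raw
    // Windows path whose backslashes can be eaten as escapes.
    val f = new File("data/files/sales.txt")
    val uri = f.toURI.toString
    assert(!uri.contains("\\"))     // no backslashes, even on Windows
    assert(uri.startsWith("file:")) // absolute file: URI
    ```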
    
    **Flaky tests**
    
    ```
    org.apache.spark.scheduler.SparkListenerSuite:
     - local metrics *** FAILED ***
    ```
    
    ## How was this patch tested?
    
    Manually tested via AppVeyor.
    
    **Failed tests**
    
    ```
    org.apache.spark.sql.hive.execution.HiveQuerySuite:
     - transform with SerDe3 !!! CANCELED !!! (0 milliseconds)
     - transform with SerDe4 !!! CANCELED !!! (0 milliseconds)
    ```
    
    ```
    org.apache.spark.sql.hive.execution.HiveDDLSuite:
     - create hive serde table with new syntax (1 second, 672 milliseconds)
    ```
    
    **Aborted tests**
    
    ```
    spark.sql.hive.execution.HiveSerDeSuite:
     - Read with RegexSerDe (2 seconds, 142 milliseconds)
     - Read and write with LazySimpleSerDe (tab separated) (2 seconds)
     - Read with AvroSerDe (1 second, 47 milliseconds)
     - Read Partitioned with AvroSerDe (1 second, 422 milliseconds)
    ```
    
    **Flaky tests**
    
    ```
    org.apache.spark.scheduler.SparkListenerSuite:
     - local metrics (4 seconds, 562 milliseconds)
    ```

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/HyukjinKwon/spark set-path-appveyor

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/16586.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #16586
    
----
commit 9ce8846dfc3678649cc041dc762fcc6ed8038527
Author: hyukjinkwon <gu...@gmail.com>
Date:   2017-01-15T14:11:45Z

    Fix flaky, newly introduced and missed test failures on Windows

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    **[Test build #71399 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/71399/testReport)** for PR 16586 at commit [`9ce8846`](https://github.com/apache/spark/commit/9ce8846dfc3678649cc041dc762fcc6ed8038527).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    @srowen, @shivaram and @felixcheung, this is the problem I previously reported to the three of you - [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=E7636D1D-41D6-4E36-9E15-B26EBB03B9E1&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/E7636D1D-41D6-4E36-9E15-B26EBB03B9E1) I have jobs queued at AppVeyor but they never start.
    
    I observed the same in the ASF account for Spark - https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark/history. It could simply be jobs queued up from other projects, or the same problem happening there. To verify this, I would need to check whether any job is running in any Apache project on AppVeyor. In my case it is easy to check because I only have a single project, spark.
    
    So, does any of you (as a committer or PMC) know an easy way to retrieve the list of Apache projects that use AppVeyor? I am willing to check each project once I get the list, and to report this to AppVeyor if the same thing is happening to the ASF account too.
    
    





[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    @HyukjinKwon I have no info on this I'm afraid. From searching the internet for "appveyor apache" I think Thrift and Nifi might be using it too. Are you saying your jobs never run in AppVeyor, or take a long time to schedule? You're welcome to tackle the problem if you can.
    
    What's the status here -- do we still need a positive result from AppVeyor before merging this?




[GitHub] spark pull request #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the re...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16586#discussion_r96306753
  
    --- Diff: core/src/test/scala/org/apache/spark/scheduler/SparkListenerSuite.scala ---
    @@ -229,7 +229,7 @@ class SparkListenerSuite extends SparkFunSuite with LocalSparkContext with Match
         }
     
         val numSlices = 16
    -    val d = sc.parallelize(0 to 1e3.toInt, numSlices).map(w)
    +    val d = sc.parallelize(0 to 1e4.toInt, numSlices).map(w)
    --- End diff --
    
    While here, feel free to just write "0 to 10000"
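    
    For what it's worth, the two spellings are equivalent; the plain literal just avoids the `Double` detour:
    
    ```scala
    // 1e4 is a Double literal; .toInt converts it back. Writing the Int
    // literal directly says the same thing without the floating-point detour.
    assert(1e4.toInt == 10000)
    // The range used in the test has 10001 elements (0 to 10000 inclusive).
    assert((0 to 10000).size == 10001)
    ```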




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/16586
  
     Build started: [TESTS] `org.apache.spark.scheduler.SparkListenerSuite` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=68031366-45EE-45B4-867A-40A4D9B1AD07&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/68031366-45EE-45B4-867A-40A4D9B1AD07)
     Build started: [TESTS] `org.apache.spark.sql.hive.execution.HiveQuerySuite` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=E04C4110-1DAC-479C-BC72-20F668E6995C&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/E04C4110-1DAC-479C-BC72-20F668E6995C)
     Build started: [TESTS] `org.apache.spark.sql.hive.execution.AggregationQuerySuite` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=CF024224-C21C-4466-B624-4E3427F89719&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/CF024224-C21C-4466-B624-4E3427F89719)
     Build started: [TESTS] `org.apache.spark.sql.hive.StatisticsSuite` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=F89A5E7F-3E82-434F-8BF1-2543FCDE44B6&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/F89A5E7F-3E82-434F-8BF1-2543FCDE44B6)
     Build started: [TESTS] `org.apache.spark.sql.hive.execution.SQLQuerySuite` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=2452EE72-87D6-4F87-BD1F-0FF1D281C9BD&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/2452EE72-87D6-4F87-BD1F-0FF1D281C9BD)




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    Build started: [TESTS] `ALL` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=DC2F6B6C-C81D-422D-BF49-F9E8E8C30D35&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/DC2F6B6C-C81D-422D-BF49-F9E8E8C30D35)
    
    Build started: [TESTS] `ALL` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=34C6743B-53BE-4554-AA74-0DEADB9514E6&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/34C6743B-53BE-4554-AA74-0DEADB9514E6)
    
    Build started: [TESTS] `ALL` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=383BE26D-94C7-4647-96ED-7C9F40205315&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/383BE26D-94C7-4647-96ED-7C9F40205315)
    
    Build started: [TESTS] `ALL` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=7F64F197-A80E-4FF6-AEA3-3DB887CC5FE7&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/7F64F197-A80E-4FF6-AEA3-3DB887CC5FE7)
    
    Build started: [TESTS] `ALL` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=A0FBE0CC-007A-4DE8-84AA-A6ED6EEE11D5&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/A0FBE0CC-007A-4DE8-84AA-A6ED6EEE11D5)
    
    Build started: [TESTS] `ALL` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=E48D9698-5E86-4DAF-A7CE-BE9E64FDD7A1&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/E48D9698-5E86-4DAF-A7CE-BE9E64FDD7A1)
    
    Build started: [TESTS] `ALL` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=6988C010-C118-43AC-8B9A-EADAE398B54E&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/6988C010-C118-43AC-8B9A-EADAE398B54E)
    
    Build started: [TESTS] `ALL` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=355029B7-760B-4BA9-8DAC-779C8BC2B744&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/355029B7-760B-4BA9-8DAC-779C8BC2B744)
    
    Build started: [TESTS] `ALL` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=984252FD-9144-4672-96B1-8187B1343EEA&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/984252FD-9144-4672-96B1-8187B1343EEA)
    
    Build started: [TESTS] `ALL` [![PR-16586](https://ci.appveyor.com/api/projects/status/github/spark-test/spark?branch=4CED883A-7CA0-4571-A6B1-28184D0AD1AF&svg=true)](https://ci.appveyor.com/project/spark-test/spark/branch/4CED883A-7CA0-4571-A6B1-28184D0AD1AF)




[GitHub] spark pull request #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the re...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16586#discussion_r96142347
  
    --- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala ---
    @@ -221,8 +221,8 @@ class HiveDDLSuite
             sql(
               s"""
                  |ALTER TABLE $tab ADD
    -             |PARTITION (ds='2008-04-08', hr=11) LOCATION '$part1Path'
    -             |PARTITION (ds='2008-04-08', hr=12) LOCATION '$part2Path'
    +             |PARTITION (ds='2008-04-08', hr=11) LOCATION '${part1Path.toURI}'
    --- End diff --
    
    Just wondering what is the reason?




[GitHub] spark pull request #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the re...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16586#discussion_r96466278
  
    --- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveQuerySuite.scala ---
    @@ -461,7 +461,8 @@ class HiveQuerySuite extends HiveComparisonTest with SQLTestUtils with BeforeAnd
           |('serialization.last.column.takes.rest'='true') USING 'cat' AS (tKey, tValue)
           |ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
           |WITH SERDEPROPERTIES ('serialization.last.column.takes.rest'='true') FROM src;
    -    """.stripMargin.replaceAll(System.lineSeparator(), " "))
    +    """.stripMargin.replaceAll(System.lineSeparator(), " "),
    +    skip = !TestUtils.testCommandAvailable("/bin/bash"))
    --- End diff --
    
    what's the cause we need to skip this test? perhaps add a comment to help keep a record of this?





[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    **[Test build #71595 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/71595/testReport)** for PR 16586 at commit [`6859569`](https://github.com/apache/spark/commit/68595694cf6462414d9c9ca5c4a300f42ca2e3ab).




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    **[Test build #71595 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/71595/testReport)** for PR 16586 at commit [`6859569`](https://github.com/apache/spark/commit/68595694cf6462414d9c9ca5c4a300f42ca2e3ab).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the re...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16586#discussion_r96278101
  
    --- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala ---
    @@ -221,8 +221,8 @@ class HiveDDLSuite
             sql(
               s"""
                  |ALTER TABLE $tab ADD
    -             |PARTITION (ds='2008-04-08', hr=11) LOCATION '$part1Path'
    -             |PARTITION (ds='2008-04-08', hr=12) LOCATION '$part2Path'
    +             |PARTITION (ds='2008-04-08', hr=11) LOCATION '${part1Path.toURI}'
    --- End diff --
    
    Thanks! I will keep it in mind. We have not been following this rule when writing the test cases.




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    (Just FYI, it is now fixed for both my account and the ASF account.)




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    Ah, I see. Yes, I am sorry for not clarifying. I actually meant two problems. One is that jobs in my account never start on AppVeyor, as above.
    
    The other is that I suspect the same thing is happening in the current ASF account, because I observed that some AppVeyor jobs for SparkR (for other PRs) seem not to have run for two days, up to now. Let me try my best to verify this.
    
    As for the status of this PR, I wanted to show you all a green result from AppVeyor to complete fixing the tests on Windows for now (of course I will keep fixing newly introduced failures in the future). Let me try to show a green result after contacting AppVeyor about my account.





[GitHub] spark pull request #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the re...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16586#discussion_r96316662
  
    --- Diff: core/src/test/scala/org/apache/spark/scheduler/SparkListenerSuite.scala ---
    @@ -229,7 +229,7 @@ class SparkListenerSuite extends SparkFunSuite with LocalSparkContext with Match
         }
     
         val numSlices = 16
    -    val d = sc.parallelize(0 to 1e3.toInt, numSlices).map(w)
    +    val d = sc.parallelize(0 to 1e4.toInt, numSlices).map(w)
    --- End diff --
    
    Sure, thanks.




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #16586: [SPARK-19117][SPARK-18922][TESTS] Fix the rest of flaky,...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    Hi @srowen, I think this is ready for a second look. In short, the current status is:
    
    - there are some test failures (https://github.com/apache/spark/pull/16586#issuecomment-273437565) when running at the package level, which look possibly flaky
    - those failures pass when run individually via `test-only` (https://github.com/apache/spark/pull/16586#issuecomment-273952379)
    - `local metrics` still seems flaky, but less so in individual runs, judging from the build results in https://github.com/apache/spark/pull/16586#discussion_r97022356




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    **[Test build #71399 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/71399/testReport)** for PR 16586 at commit [`9ce8846`](https://github.com/apache/spark/commit/9ce8846dfc3678649cc041dc762fcc6ed8038527).




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    I have no information on this either. We could file an INFRA ticket if we want to find out.




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    yea, not sure why, but AppVeyor has been stuck since around 3 days ago




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    They all pass in individual tests with `test-only` (please check the logs above).
    
    ```
    org.apache.spark.scheduler.SparkListenerSuite:
     - local metrics (8 seconds, 656 milliseconds)
    
    org.apache.spark.sql.hive.execution.HiveQuerySuite:
     - constant null testing (531 milliseconds)
    
    org.apache.spark.sql.hive.execution.AggregationQuerySuite:
     - udaf with all data types (4 seconds, 285 milliseconds)
    
    org.apache.spark.sql.hive.StatisticsSuite:
     - verify serialized column stats after analyzing columns (2 seconds, 844 milliseconds)
    
    org.apache.spark.sql.hive.execution.SQLQuerySuite:
    - dynamic partition value test (1 second, 407 milliseconds)
    - SPARK-6785: HiveQuerySuite - Date cast (188 milliseconds)
    ```
    
    Although I am still wondering how/why those tests are more flaky there (judging from observations in the builds), I think it is fair to say that, at least in the way I run them, the Spark tests are able to pass on Windows.
    
    Let me remove `[WIP]`, and I will try to make the tests more stable on Windows in the future, if this sounds reasonable.




[GitHub] spark pull request #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the re...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16586#discussion_r96554333
  
    --- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveQuerySuite.scala ---
    @@ -461,7 +461,8 @@ class HiveQuerySuite extends HiveComparisonTest with SQLTestUtils with BeforeAnd
           |('serialization.last.column.takes.rest'='true') USING 'cat' AS (tKey, tValue)
           |ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
           |WITH SERDEPROPERTIES ('serialization.last.column.takes.rest'='true') FROM src;
    -    """.stripMargin.replaceAll(System.lineSeparator(), " "))
    +    """.stripMargin.replaceAll(System.lineSeparator(), " "),
    +    skip = !TestUtils.testCommandAvailable("/bin/bash"))
    --- End diff --
    
    Script transformation such as `USING 'cat'` requires [a hard-coded `/bin/bash`](https://github.com/apache/spark/blob/21c7539a5274a7e77686d17a6261d56592b85c2d/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala#L70), which seems to be missing or located elsewhere on Windows (with or without Cygwin).
    
    I will add a single de-duplicated comment at the top (there are many instances of it) if I happen to push more commits!
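
    The guard pattern here (skip a test when a required external command is absent) can be sketched in Python; the helper name below is hypothetical, a rough analogue of Spark's `TestUtils.testCommandAvailable`, not its actual implementation:

    ```python
    import os
    import shutil
    import sys

    def command_available(command: str) -> bool:
        """Return True if `command` exists, either as an absolute path to an
        executable file or as a name resolvable on PATH.

        Tests whose fixtures need tools that are absent on some platforms
        (e.g. /bin/bash on Windows) can use such a check to skip themselves
        instead of failing.
        """
        if os.path.isabs(command):
            return os.path.isfile(command) and os.access(command, os.X_OK)
        return shutil.which(command) is not None

    # The current interpreter is always an existing executable, so this is True.
    assert command_available(sys.executable)
    ```
    
    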




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    Ah, thank you @shivaram and @felixcheung 




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    Current status of this PR:
    
    It seems the tests below have been consistently failing across 6 builds (please check the logs at https://ci.appveyor.com/project/spark-test/spark/history).
    
    ```
    org.apache.spark.scheduler.SparkListenerSuite:
     - local metrics *** FAILED *** (1 second, 487 milliseconds)
    
    org.apache.spark.sql.hive.execution.HiveQuerySuite:
    - constant null testing *** FAILED *** (562 milliseconds)
    
    org.apache.spark.sql.hive.execution.AggregationQuerySuite
    - udaf with all data types *** FAILED *** (641 milliseconds)
    
    org.apache.spark.sql.hive.StatisticsSuite
    - verify serialized column stats after analyzing columns *** FAILED *** (1 second, 110 milliseconds)
    
    org.apache.spark.sql.hive.execution.SQLQuerySuite
    - dynamic partition value test *** FAILED *** (547 milliseconds)
    - SPARK-6785: HiveQuerySuite - Date cast *** FAILED *** (156 milliseconds)
    ```
    
    Let me try to run these tests individually, because running the full suite takes too long.




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/71595/
    Test PASSed.




[GitHub] spark pull request #16586: [SPARK-19117][SPARK-18922][TESTS] Fix the rest of...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/16586




[GitHub] spark pull request #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the re...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16586#discussion_r97022356
  
    --- Diff: core/src/test/scala/org/apache/spark/scheduler/SparkListenerSuite.scala ---
    @@ -229,7 +229,7 @@ class SparkListenerSuite extends SparkFunSuite with LocalSparkContext with Match
         }
     
         val numSlices = 16
    -    val d = sc.parallelize(0 to 1e3.toInt, numSlices).map(w)
    +    val d = sc.parallelize(0 to 10000, numSlices).map(w)
    --- End diff --
    
    I am pretty sure the deserialization-time test is less flaky now, judging from the individual test runs below:
    
    **Before** - 9 failures out of 10.
    
    
    [1 (failed)](https://ci.appveyor.com/project/spark-test/spark/build/546-windows-complete/job/ktdmdxkdi4ni4ier)
    [2 (failed)](https://ci.appveyor.com/project/spark-test/spark/build/549-windows-complete/job/b4mqgyt72g6he7e7)
    [3 (failed)](https://ci.appveyor.com/project/spark-test/spark/build/551-windows-complete/job/j0ywrgv8d733yqb4)
    [4 (failed)](https://ci.appveyor.com/project/spark-test/spark/build/553-windows-complete/job/yqoapee3og5x46wk)
    [5 (passed)](https://ci.appveyor.com/project/spark-test/spark/build/554-windows-complete/job/g3hhdl5s8odu9ir0)
    [6 (failed)](https://ci.appveyor.com/project/spark-test/spark/build/555-windows-complete/job/9utyo2glowuf3ulc)
    [7 (failed)](https://ci.appveyor.com/project/spark-test/spark/build/541-windows-test/job/4gtm26hcm5327aa1)
    [8 (failed)](https://ci.appveyor.com/project/spark-test/spark/build/542-windows-test/job/166i4xiljy7iof8l)
    [9 (failed)](https://ci.appveyor.com/project/spark-test/spark/build/540-windows-test/job/39v7nwuq598p3rtm)
    [10 (failed)](https://ci.appveyor.com/project/spark-test/spark/build/539-windows-test/job/how9cbsj5i5cykeh)
    
    **After** - 1 failure out of 7.
     
    [1 (passed)](https://ci.appveyor.com/project/spark-test/spark/build/576-windows-complete/job/9sfx150cp38ofttn)
    [2 (passed)](https://ci.appveyor.com/project/spark-test/spark/build/577-windows-complete/job/nrjgs7emtlnj6y5f)
    [3 (passed)](https://ci.appveyor.com/project/spark-test/spark/build/578-windows-complete/job/qwgsuc5uas8mk0o7)
    [4 (passed)](https://ci.appveyor.com/project/spark-test/spark/build/579-windows-complete/job/sf1sspisb4ai4j7r)
    [5 (failed)](https://ci.appveyor.com/project/spark-test/spark/build/580-windows-complete/job/808c08fvnm26w3uh)
    [6 (passed)](https://ci.appveyor.com/project/spark-test/spark/build/581-windows-complete/job/y7o97qq18my44dvo)
    [7 (passed)](https://ci.appveyor.com/project/spark-test/spark/branch/68031366-45EE-45B4-867A-40A4D9B1AD07)
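    
    The intuition behind enlarging the workload can be sketched in Python (illustrative only, not Spark code): a metric recorded in whole milliseconds rounds down to 0 for tiny tasks, which makes a `> 0` assertion flaky, while a sufficiently large workload reliably yields a nonzero value.

    ```python
    import time

    def elapsed_ms(n: int) -> int:
        # Record elapsed time in whole milliseconds, like the task metrics
        # the SparkListener test asserts on; tiny workloads may round to 0.
        start = time.perf_counter()
        total = sum(i * i for i in range(n))  # stand-in for real task work
        return int((time.perf_counter() - start) * 1000)
    ```

    With a very small `n`, `elapsed_ms` can legitimately return 0, so an assertion that the metric is positive would fail intermittently depending on machine speed; scaling up the input (as the diff does with `0 to 10000`) makes the measured time comfortably nonzero.
    
    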




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    Hm, it seems the builds are stuck again. I will contact AppVeyor if they do not proceed further by tomorrow.




[GitHub] spark issue #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the rest of f...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/71399/
    Test PASSed.




[GitHub] spark issue #16586: [SPARK-19117][SPARK-18922][TESTS] Fix the rest of flaky,...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/16586
  
    Merged to master




[GitHub] spark pull request #16586: [WIP][SPARK-19117][SPARK-18922][TESTS] Fix the re...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16586#discussion_r96149617
  
    --- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala ---
    @@ -221,8 +221,8 @@ class HiveDDLSuite
             sql(
               s"""
                  |ALTER TABLE $tab ADD
    -             |PARTITION (ds='2008-04-08', hr=11) LOCATION '$part1Path'
    -             |PARTITION (ds='2008-04-08', hr=12) LOCATION '$part2Path'
    +             |PARTITION (ds='2008-04-08', hr=11) LOCATION '${part1Path.toURI}'
    --- End diff --
    
    It seems to be due to the parser. If the path is something like `C:\tmp\b\c`, the backslashes are interpreted as escape sequences (`\t`, `\b`, ...) and it ends up like `C:    mpbc`. To deal with this, we should either double the backslashes (`C:\\tmp\\b\\c`) or use a URI. The simplest choice seems to be a URI, unless the test is dedicated to such a case.
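    
    A small Python illustration of the problem and both workarounds (Python's string-escape rules are close enough to make the point; the Spark diff itself uses Java's `File.toURI`):

    ```python
    from pathlib import PureWindowsPath

    # In many parsers, "C:\tmp" is not 5 literal path characters:
    # the \t collapses into a tab character during unescaping.
    escaped = "C:\tmp"
    assert "\t" in escaped and "\\" not in escaped

    # Workaround 1: double the backslashes so they survive parsing.
    doubled = "C:\\tmp\\b\\c"
    assert doubled.count("\\") == 3  # three real backslashes, no escapes

    # Workaround 2: convert the path to a file: URI, which uses forward
    # slashes and so has nothing for the parser to unescape.
    uri = PureWindowsPath("C:\\tmp\\b\\c").as_uri()
    ```
    
    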

