Posted to reviews@spark.apache.org by felixcheung <gi...@git.apache.org> on 2017/01/22 02:19:38 UTC

[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

GitHub user felixcheung opened a pull request:

    https://github.com/apache/spark/pull/16670

    [SPARK-19324][SPARKR] Spark JVM stdout output is getting dropped in SparkR

    ## What changes were proposed in this pull request?
    
    This mostly affects running a job from the driver in client mode when results are expected to come through stdout (which should be somewhat rare, but possible).
    
    Before:
    ```
    > a <- as.DataFrame(cars)
    > b <- group_by(a, "dist")
    > c <- count(b)
    > sparkR.callJMethod(c$count@jc, "explain", TRUE)
    NULL
    ```
    
    After:
    ```
    > a <- as.DataFrame(cars)
    > b <- group_by(a, "dist")
    > c <- count(b)
    > sparkR.callJMethod(c$count@jc, "explain", TRUE)
    count#11L
    NULL
    ```
    
    Now, `column.explain()` doesn't seem very useful (we can get more extensive output with `DataFrame.explain()`), but there are other, more complex examples with calls to `println` on the Scala/JVM side.
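
    The gist of the change (a minimal sketch paraphrasing the `R/pkg/R/utils.R` diff discussed in the review comments; the Windows branch is omitted here):
    
    ```
    # Before: with the default capture = FALSE, system2() received stdout = FALSE,
    # which discards the JVM's stdout entirely.
    launchScript <- function(script, combinedArgs, capture = FALSE) {
      system2(script, combinedArgs, wait = capture, stdout = capture)
    }
    
    # After: the parameter is renamed to wait, and stdout is left at its default (""),
    # so the child's stdout is shared with the console instead of being dropped.
    launchScript <- function(script, combinedArgs, wait = FALSE) {
      system2(script, combinedArgs, wait = wait)
    }
    ```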
    
    
    ## How was this patch tested?
    
    manual


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/felixcheung/spark rjvmstdout

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/16670.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #16670
    
----
commit 294ce991d2e1c8d7a38b526ccf4f35a7ac41fbc1
Author: Felix Cheung <fe...@hotmail.com>
Date:   2017-01-22T02:14:06Z

    do not drop stdout

----


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r98267056
  
    --- Diff: R/pkg/R/utils.R ---
    @@ -756,12 +756,12 @@ varargsToJProperties <- function(...) {
       props
     }
     
    -launchScript <- function(script, combinedArgs, capture = FALSE) {
    +launchScript <- function(script, combinedArgs, wait = FALSE) {
       if (.Platform$OS.type == "windows") {
         scriptWithArgs <- paste(script, combinedArgs, sep = " ")
    -    shell(scriptWithArgs, translate = TRUE, wait = capture, intern = capture) # nolint
    +    shell(scriptWithArgs, translate = TRUE, wait = wait, intern = wait) # nolint
       } else {
    -    system2(script, combinedArgs, wait = capture, stdout = capture)
    +    system2(script, combinedArgs, wait = wait)
    --- End diff --
    
    Minor - can we explicitly pass `stdout = ""` and put this comment on top of it ?
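
    A sketch of what that could look like (the comment wording here is assumed, not taken from the PR):
    
    ```
    launchScript <- function(script, combinedArgs, wait = FALSE) {
      if (.Platform$OS.type == "windows") {
        scriptWithArgs <- paste(script, combinedArgs, sep = " ")
        shell(scriptWithArgs, translate = TRUE, wait = wait, intern = wait) # nolint
      } else {
        # stdout = "" is system2()'s default: the child's output goes to the console,
        # rather than being discarded (stdout = FALSE) or captured (stdout = TRUE)
        system2(script, combinedArgs, wait = wait, stdout = "")
      }
    }
    ```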


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r97211236
  
    --- Diff: R/pkg/R/utils.R ---
    @@ -756,12 +756,12 @@ varargsToJProperties <- function(...) {
       props
     }
     
    -launchScript <- function(script, combinedArgs, capture = FALSE) {
    +launchScript <- function(script, combinedArgs, wait = FALSE) {
       if (.Platform$OS.type == "windows") {
         scriptWithArgs <- paste(script, combinedArgs, sep = " ")
    -    shell(scriptWithArgs, translate = TRUE, wait = capture, intern = capture) # nolint
    +    shell(scriptWithArgs, translate = TRUE, wait = wait, intern = wait) # nolint
       } else {
    -    system2(script, combinedArgs, wait = capture, stdout = capture)
    +    system2(script, combinedArgs, wait = wait)
    --- End diff --
    
    http://www.astrostatistics.psu.edu/datasets/R/html/base/html/shell.html
    On Windows, `intern = FALSE` seems to mean the output goes to the console.
    (The doc page is missing on stat.ethz.ch.)
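
    For illustration, roughly how the two `intern` modes behave on Windows (hypothetical standalone calls, not from the PR):
    
    ```
    # intern = FALSE: the output goes to the console; the (invisible) return value
    # is the command's exit status.
    shell("ECHO a/b/c", translate = TRUE, wait = TRUE, intern = FALSE)
    
    # intern = TRUE: the output is captured and returned as a character vector.
    out <- shell("ECHO a/b/c", translate = TRUE, wait = TRUE, intern = TRUE)
    ```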


[GitHub] spark issue #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is getting...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/16670
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/72084/
    Test PASSed.


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r97393450
  
    --- Diff: R/pkg/inst/tests/testthat/test_Windows.R ---
    @@ -20,7 +20,7 @@ test_that("sparkJars tag in SparkContext", {
       if (.Platform$OS.type != "windows") {
         skip("This test is only for Windows, skipped")
       }
    -  testOutput <- launchScript("ECHO", "a/b/c", capture = TRUE)
    +  testOutput <- launchScript("ECHO", "a/b/c", wait = TRUE)
    --- End diff --
    
    From what I observed running SparkR as a package, I'm not sure we should always pipe/redirect the stdout - it could get very noisy running SparkR from an IDE. On the other hand, oftentimes the result (error) alone is not enough to debug the issue.
    
    I'd propose we don't redirect stdout by default, to keep the IDE experience cleaner; however, we should have an API to turn this on and off programmatically on demand. It is not clear `system2` supports that, though: `stdout` is either `TRUE` (capture and return as a character vector), `NULL`/`FALSE` (drop), `""` (to the console *of that child process*), or `"name"` (the name of a file that stdout gets logged to).
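
    For reference, a small illustration of those four modes (assuming a POSIX `echo` on the PATH):
    
    ```
    system2("echo", "hello", stdout = TRUE)       # capture: returns "hello" as a character vector
    system2("echo", "hello", stdout = FALSE)      # discard the output
    system2("echo", "hello", stdout = "")         # default: write to the console (of that child process)
    system2("echo", "hello", stdout = "out.log")  # redirect to the file out.log
    ```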


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r97939920
  
    --- Diff: R/pkg/inst/tests/testthat/test_Windows.R ---
    @@ -20,7 +20,7 @@ test_that("sparkJars tag in SparkContext", {
       if (.Platform$OS.type != "windows") {
         skip("This test is only for Windows, skipped")
       }
    -  testOutput <- launchScript("ECHO", "a/b/c", capture = TRUE)
    +  testOutput <- launchScript("ECHO", "a/b/c", wait = TRUE)
    --- End diff --
    
    This is an example of what one sees in an R IDE:
    ```
    > head(p, 40)
    Error in handleErrors(returnStatus, conn) :
      org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 70.0 failed 1 times, most recent failure: Lost task 0.0 in stage 70.0 (TID 115, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function($anonfun$4: (string) => double)
           at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
           at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
           at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
           at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:231)
           at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:225)
           at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:826)
           at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:826)
    ```


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r97211194
  
    --- Diff: R/pkg/R/utils.R ---
    @@ -756,12 +756,12 @@ varargsToJProperties <- function(...) {
       props
     }
     
    -launchScript <- function(script, combinedArgs, capture = FALSE) {
    +launchScript <- function(script, combinedArgs, wait = FALSE) {
       if (.Platform$OS.type == "windows") {
         scriptWithArgs <- paste(script, combinedArgs, sep = " ")
    -    shell(scriptWithArgs, translate = TRUE, wait = capture, intern = capture) # nolint
    +    shell(scriptWithArgs, translate = TRUE, wait = wait, intern = wait) # nolint
       } else {
    -    system2(script, combinedArgs, wait = capture, stdout = capture)
    +    system2(script, combinedArgs, wait = wait)
    --- End diff --
    
    http://stat.ethz.ch/R-manual/R-devel/library/base/html/system2.html
    `stdout = FALSE` means "discard output";
    `stdout = ""` (the default) means the output goes to the console.


[GitHub] spark issue #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is getting...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/16670
  
    **[Test build #72084 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/72084/testReport)** for PR 16670 at commit [`322b760`](https://github.com/apache/spark/commit/322b7600838f20ea48a7108e6e00ffcb51e1818e).


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r97229573
  
    --- Diff: R/pkg/inst/tests/testthat/test_Windows.R ---
    @@ -20,7 +20,7 @@ test_that("sparkJars tag in SparkContext", {
       if (.Platform$OS.type != "windows") {
         skip("This test is only for Windows, skipped")
       }
    -  testOutput <- launchScript("ECHO", "a/b/c", capture = TRUE)
    +  testOutput <- launchScript("ECHO", "a/b/c", wait = TRUE)
    --- End diff --
    
    I see - I expected `stdout = ""` to pipe it to the R process's stdout. We could also explicitly pass an `fd` to do this pipe?


[GitHub] spark issue #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is getting...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/16670
  
    **[Test build #71790 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/71790/testReport)** for PR 16670 at commit [`294ce99`](https://github.com/apache/spark/commit/294ce991d2e1c8d7a38b526ccf4f35a7ac41fbc1).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


[GitHub] spark issue #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is getting...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/16670
  
    **[Test build #71790 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/71790/testReport)** for PR 16670 at commit [`294ce99`](https://github.com/apache/spark/commit/294ce991d2e1c8d7a38b526ccf4f35a7ac41fbc1).


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r98113719
  
    --- Diff: R/pkg/inst/tests/testthat/test_Windows.R ---
    @@ -20,7 +20,7 @@ test_that("sparkJars tag in SparkContext", {
       if (.Platform$OS.type != "windows") {
         skip("This test is only for Windows, skipped")
       }
    -  testOutput <- launchScript("ECHO", "a/b/c", capture = TRUE)
    +  testOutput <- launchScript("ECHO", "a/b/c", wait = TRUE)
    --- End diff --
    
    From https://github.com/apache/spark/pull/16670#discussion_r97393450, this is the case where "oftentimes the result (error) alone is not enough to debug the issue."
    
    Our options are:
    - redirect stdout always (could be very noisy)
    - do not redirect stdout by default, but also do not drop it - if the R shell shares the console with the JVM (i.e. the sparkR shell), the user would see the messages, but in an IDE the user would not
      - I'd further propose an API to turn on redirection on demand, which would address the R IDE case


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/16670


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r97214373
  
    --- Diff: R/pkg/inst/tests/testthat/test_Windows.R ---
    @@ -20,7 +20,7 @@ test_that("sparkJars tag in SparkContext", {
       if (.Platform$OS.type != "windows") {
         skip("This test is only for Windows, skipped")
       }
    -  testOutput <- launchScript("ECHO", "a/b/c", capture = TRUE)
    +  testOutput <- launchScript("ECHO", "a/b/c", wait = TRUE)
    --- End diff --
    
    We could, but unfortunately we don't actually call `launchScript` with `wait`/`capture = TRUE`;
    we call it with `wait`/`capture = FALSE` and expect the console/stdout output to leak through while the call returns `NULL`.
    
    I'll try to add a test for that.


[GitHub] spark issue #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is getting...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/16670
  
    Merged build finished. Test PASSed.


[GitHub] spark issue #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is getting...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/16670
  
    Merged build finished. Test PASSed.


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r97214965
  
    --- Diff: R/pkg/inst/tests/testthat/test_Windows.R ---
    @@ -20,7 +20,7 @@ test_that("sparkJars tag in SparkContext", {
       if (.Platform$OS.type != "windows") {
         skip("This test is only for Windows, skipped")
       }
    -  testOutput <- launchScript("ECHO", "a/b/c", capture = TRUE)
    +  testOutput <- launchScript("ECHO", "a/b/c", wait = TRUE)
    --- End diff --
    
    Hmm, I've tried; I don't think it would work.
    When calling `system2(..., wait = FALSE, stdout = "")`, the output to stdout actually comes from the child process, so I don't think we would be able to see it from the R process.
    We could redirect it, but then it would be the same as `system2(..., wait = FALSE, stdout = TRUE)`, which again is not what we normally call.
    
    I think we would need to dig deeper into this.


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r97212748
  
    --- Diff: R/pkg/inst/tests/testthat/test_Windows.R ---
    @@ -20,7 +20,7 @@ test_that("sparkJars tag in SparkContext", {
       if (.Platform$OS.type != "windows") {
         skip("This test is only for Windows, skipped")
       }
    -  testOutput <- launchScript("ECHO", "a/b/c", capture = TRUE)
    +  testOutput <- launchScript("ECHO", "a/b/c", wait = TRUE)
    --- End diff --
    
    Can we add a similar test with something getting printed to `stdout` from the JVM?
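
    A hypothetical sketch of what such a test might look like (names and assertions here are assumptions, not the test added in this PR; it assumes an active SparkR session):
    
    ```
    test_that("JVM println output is not dropped", {
      df <- createDataFrame(data.frame(x = 1:3))
      # Dataset.explain(TRUE) prints the query plan via println on the JVM side;
      # with stdout shared with the R process, the plan should show up on the console.
      expect_error(sparkR.callJMethod(df@sdf, "explain", TRUE), NA)
    })
    ```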


[GitHub] spark issue #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is getting...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/16670
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/71790/
    Test PASSed.


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r98092733
  
    --- Diff: R/pkg/inst/tests/testthat/test_Windows.R ---
    @@ -20,7 +20,7 @@ test_that("sparkJars tag in SparkContext", {
       if (.Platform$OS.type != "windows") {
         skip("This test is only for Windows, skipped")
       }
    -  testOutput <- launchScript("ECHO", "a/b/c", capture = TRUE)
    +  testOutput <- launchScript("ECHO", "a/b/c", wait = TRUE)
    --- End diff --
    
    Is this what they see before this change or after it?


[GitHub] spark pull request #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is ...

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16670#discussion_r98264755
  
    --- Diff: R/pkg/inst/tests/testthat/test_Windows.R ---
    @@ -20,7 +20,7 @@ test_that("sparkJars tag in SparkContext", {
       if (.Platform$OS.type != "windows") {
         skip("This test is only for Windows, skipped")
       }
    -  testOutput <- launchScript("ECHO", "a/b/c", capture = TRUE)
    +  testOutput <- launchScript("ECHO", "a/b/c", wait = TRUE)
    --- End diff --
    
    I think the second option is fine - i.e. not redirecting it by default, but happening to share stdout with the R process. That way, if say the R IDE has some way to save or view logs from R's stdout, users can use that (does RStudio have something like this?).
    
    The API to redirect on demand might be useful (it would be something like `setLogLevel`?), but I'm not sure we can change it for an already running JVM?
    
    Anyway, let me review this PR one more time; I think we can discuss the new API in a separate JIRA.
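
    For comparison, the existing `setLogLevel` API mentioned above (a usage sketch, assuming an active session):
    
    ```
    sparkR.session()
    # adjusts the JVM-side log level for the running session on demand
    setLogLevel("WARN")
    ```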


[GitHub] spark issue #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is getting...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/16670
  
    **[Test build #72084 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/72084/testReport)** for PR 16670 at commit [`322b760`](https://github.com/apache/spark/commit/322b7600838f20ea48a7108e6e00ffcb51e1818e).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


[GitHub] spark issue #16670: [SPARK-19324][SPARKR] Spark JVM stdout output is getting...

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on the issue:

    https://github.com/apache/spark/pull/16670
  
    LGTM. Merging this to master, branch-2.1


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org