Posted to issues@spark.apache.org by "Shivaram Venkataraman (JIRA)" <ji...@apache.org> on 2016/09/03 20:58:20 UTC

[jira] [Resolved] (SPARK-16829) sparkR sc.setLogLevel doesn't work

     [ https://issues.apache.org/jira/browse/SPARK-16829?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shivaram Venkataraman resolved SPARK-16829.
-------------------------------------------
       Resolution: Fixed
    Fix Version/s: 2.1.0

Issue resolved by pull request 14433
[https://github.com/apache/spark/pull/14433]

> sparkR sc.setLogLevel doesn't work
> ----------------------------------
>
>                 Key: SPARK-16829
>                 URL: https://issues.apache.org/jira/browse/SPARK-16829
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Miao Wang
>             Fix For: 2.1.0
>
>
> ./bin/sparkR
> Launching java with spark-submit command /Users/mwang/spark_ws_0904/bin/spark-submit   "sparkr-shell" /var/folders/s_/83b0sgvj2kl2kwq4stvft_pm0000gn/T//RtmpQxJGiZ/backend_porte9474603ed1e 
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel).
> > sc.setLogLevel("INFO")
> Error: could not find function "sc.setLogLevel"
> The function sc.setLogLevel does not exist in SparkR, even though the startup message suggests it.
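For context (an editorial note, not part of the original report): the `sc.setLogLevel(...)` syntax shown in the startup banner is the Scala/Python form. In SparkR the equivalent is the top-level setLogLevel() function. A minimal sketch, assuming a SparkR session on Spark 2.x:

```r
# Hedged sketch: in SparkR the log level is adjusted via the
# top-level setLogLevel() function, not a sc.setLogLevel() method.
library(SparkR)
sparkR.session()      # start (or attach to) a SparkR session
setLogLevel("INFO")   # set the driver-side log level
```

The pull request referenced above resolved the mismatch between the banner's suggested call and the actual SparkR API.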



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org