Posted to issues@spark.apache.org by "Alexander Tronchin-James (JIRA)" <ji...@apache.org> on 2019/07/17 21:53:00 UTC
[jira] [Commented] (SPARK-18829) Printing to logger
[ https://issues.apache.org/jira/browse/SPARK-18829?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16887450#comment-16887450 ]
Alexander Tronchin-James commented on SPARK-18829:
--------------------------------------------------
FWIW, the showString method on Dataset is private, so it can't be called except from internal Dataset methods.
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala#L295
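One workaround, since showString is private: redirect Console.out around the call to show and hand the captured text to a logger. This is a sketch, not Spark API; the capture helper below is plain Scala, and it assumes Spark's show writes through Predef.println (which delegates to Console.out). `df` and `logger` are placeholders for your Dataset and logger.

```scala
import java.io.ByteArrayOutputStream

// Sketch of a workaround: capture anything printed to Console.out
// (e.g. by df.show()) inside `body` and return it as a string,
// so it can be passed to a logger instead of lost on stdout.
def captureStdout(body: => Unit): String = {
  val buf = new ByteArrayOutputStream()
  Console.withOut(buf) { body }
  buf.toString("UTF-8")
}

// Hypothetical usage with a Spark Dataset and an slf4j-style logger:
// val table = captureStdout { df.show(20, truncate = false) }
// logger.debug(table)
```

Note this only captures output routed through Scala's Console; code that writes directly to java.lang.System.out would need System.setOut instead.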
> Printing to logger
> ------------------
>
> Key: SPARK-18829
> URL: https://issues.apache.org/jira/browse/SPARK-18829
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 1.6.2
> Environment: ALL
> Reporter: David Hodeffi
> Priority: Trivial
> Labels: easyfix, patch
> Original Estimate: 1h
> Remaining Estimate: 1h
>
> I would like to print dataframe.show or df.explain(true) into a log file.
> Right now the code prints to standard output with no way to redirect it,
> and this cannot be configured in log4j.properties.
> My suggestion is to write to both the logger and standard output,
> i.e.
> class DataFrame { ...
>   override def explain(extended: Boolean): Unit = {
>     val explain = ExplainCommand(queryExecution.logical, extended = extended)
>     sqlContext.executePlan(explain).executedPlan.executeCollect().foreach { r =>
>       // scalastyle:off println
>       println(r.getString(0))
>       // scalastyle:on println
>       logger.debug(r.getString(0))
>     }
>   }
>
>   def show(numRows: Int, truncate: Boolean): Unit = {
>     val str = showString(numRows, truncate)
>     println(str)
>     logger.debug(str)
>   }
> }
--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org