Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/06/05 01:27:00 UTC

[jira] [Resolved] (SPARK-24215) Implement __repr__ and _repr_html_ for dataframes in PySpark

     [ https://issues.apache.org/jira/browse/SPARK-24215?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-24215.
----------------------------------
       Resolution: Fixed
    Fix Version/s: 2.4.0

Issue resolved by pull request 21370
[https://github.com/apache/spark/pull/21370]

> Implement __repr__ and _repr_html_ for dataframes in PySpark
> ------------------------------------------------------------
>
>                 Key: SPARK-24215
>                 URL: https://issues.apache.org/jira/browse/SPARK-24215
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark, Spark Core, SQL
>    Affects Versions: 2.3.0
>            Reporter: Ryan Blue
>            Assignee: Li Yuanjian
>            Priority: Major
>             Fix For: 2.4.0
>
>
> To help people who are new to Spark get feedback more easily, we should implement the repr methods for Jupyter Python kernels. That way, when users run PySpark in a Jupyter console or notebook, they get good feedback about the queries they've defined.
> This should include an option for eager evaluation (maybe {{spark.jupyter.eager-eval}}?). When set, the repr methods would evaluate the dataframe and produce output like {{show}}. This is a good balance between not hiding Spark's action behavior and getting feedback to users who don't know to call actions.
> Here's the dev list thread for context: http://apache-spark-developers-list.1001551.n3.nabble.com/eager-execution-and-debuggability-td23928.html
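
As a rough illustration of the idea being proposed (not the actual patch: the DataFrame class below is a self-contained stand-in for pyspark.sql.DataFrame, and the eager_eval flag stands in for the config option suggested above):

    import html

    class DataFrame:
        """Stand-in for pyspark.sql.DataFrame; illustrates the repr hooks only."""

        def __init__(self, rows, columns, eager_eval=False, max_rows=20):
            self._rows = rows              # in real Spark these would come from an action
            self._columns = columns
            self._eager_eval = eager_eval  # stands in for the proposed eager-eval option
            self._max_rows = max_rows

        def __repr__(self):
            # Plain Python consoles call this; without eager evaluation we only
            # show the schema, so no Spark job is triggered.
            if not self._eager_eval:
                return "DataFrame[%s]" % ", ".join(self._columns)
            lines = ["\t".join(self._columns)]
            for row in self._rows[:self._max_rows]:
                lines.append("\t".join(str(v) for v in row))
            return "\n".join(lines)

        def _repr_html_(self):
            # Jupyter prefers this hook when it exists; returning None makes
            # Jupyter fall back to __repr__.
            if not self._eager_eval:
                return None
            header = "".join("<th>%s</th>" % html.escape(c) for c in self._columns)
            body = "".join(
                "<tr>%s</tr>" % "".join(
                    "<td>%s</td>" % html.escape(str(v)) for v in row)
                for row in self._rows[:self._max_rows])
            return "<table><tr>%s</tr>%s</table>" % (header, body)

    df = DataFrame([(1, "a"), (2, "b")], ["id", "value"], eager_eval=True)
    print(df)  # tab-separated preview; a Jupyter notebook renders the HTML table instead

For reference, the configuration that shipped in Spark 2.4.0 is {{spark.sql.repl.eagerEval.enabled}} rather than the {{spark.jupyter.eager-eval}} name floated in the description.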



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org