Posted to issues@spark.apache.org by "jiaan.geng (Jira)" <ji...@apache.org> on 2023/01/05 08:16:00 UTC
[jira] [Commented] (SPARK-41824) Implement DataFrame.explain format to be similar to PySpark
[ https://issues.apache.org/jira/browse/SPARK-41824?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17654852#comment-17654852 ]
jiaan.geng commented on SPARK-41824:
------------------------------------
I will take a look!
> Implement DataFrame.explain format to be similar to PySpark
> -----------------------------------------------------------
>
> Key: SPARK-41824
> URL: https://issues.apache.org/jira/browse/SPARK-41824
> Project: Spark
> Issue Type: Sub-task
> Components: Connect
> Affects Versions: 3.4.0
> Reporter: Sandeep Singh
> Priority: Major
>
> {code:java}
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 1296, in pyspark.sql.connect.dataframe.DataFrame.explain
> Failed example:
> df.explain()
> Expected:
> == Physical Plan ==
> *(1) Scan ExistingRDD[age...,name...]
> Got:
> == Physical Plan ==
> LocalTableScan [age#1148L, name#1149]
> <BLANKLINE>
> <BLANKLINE>
> **********************************************************************
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 1314, in pyspark.sql.connect.dataframe.DataFrame.explain
> Failed example:
> df.explain(mode="formatted")
> Expected:
> == Physical Plan ==
> * Scan ExistingRDD (...)
> (1) Scan ExistingRDD [codegen id : ...]
> Output [2]: [age..., name...]
> ...
> Got:
> == Physical Plan ==
> LocalTableScan (1)
> <BLANKLINE>
> <BLANKLINE>
> (1) LocalTableScan
> Output [2]: [age#1170L, name#1171]
> Arguments: [age#1170L, name#1171]
> <BLANKLINE>
> <BLANKLINE>{code}
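
For illustration (not part of the original report): the expected doctest output above uses `...` wildcards, which PySpark's doctests match under doctest's ELLIPSIS option. The failure is not just cosmetic, because even with ELLIPSIS the expected text cannot match the Connect plan: the operator name itself differs (`Scan ExistingRDD` vs. `LocalTableScan`). A minimal sketch with the stdlib `doctest` module, using plan strings copied from the report:

```python
import doctest

checker = doctest.OutputChecker()
got = "LocalTableScan [age#1148L, name#1149]\n"  # plan text from Spark Connect

# The doctest's expected output: literal "Scan ExistingRDD" never appears in
# the Connect plan, so ELLIPSIS matching fails.
want_classic = "*(1) Scan ExistingRDD[age...,name...]\n"
print(checker.check_output(want_classic, got, doctest.ELLIPSIS))   # False

# A hypothetical mode-agnostic expectation (not from the ticket) that keeps
# only the shared "TableScan" suffix would match both backends.
want_relaxed = "...TableScan [age..., name...]\n"
print(checker.check_output(want_relaxed, got, doctest.ELLIPSIS))   # True
```

This suggests the fix is either to make Connect's `DataFrame.explain` emit the same plan rendering as classic PySpark, or to loosen the doctest expectations so both backends match.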
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org