Posted to issues@spark.apache.org by "Aman Omer (Jira)" <ji...@apache.org> on 2019/09/21 13:32:00 UTC

[jira] [Comment Edited] (SPARK-29053) Sort does not work on some columns

    [ https://issues.apache.org/jira/browse/SPARK-29053?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16935041#comment-16935041 ] 

Aman Omer edited comment on SPARK-29053 at 9/21/19 1:31 PM:
------------------------------------------------------------

PR for branch 2.4 [https://github.com/apache/spark/pull/25855]


was (Author: aman_omer):
PR for  branch 2.4 [https://github.com/apache/spark/pull/25855]

> Sort does not work on some columns
> ----------------------------------
>
>                 Key: SPARK-29053
>                 URL: https://issues.apache.org/jira/browse/SPARK-29053
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 2.4.3
>            Reporter: jobit mathew
>            Assignee: Aman Omer
>            Priority: Minor
>             Fix For: 3.0.0
>
>         Attachments: Duration_1.png, ExecutionTime_1.png, Sort Icon.png
>
>
> In the Spark Thrift JDBC/ODBC Server application UI, *sorting* does not work for the *Duration* and *Execution time* fields.
> *Test Steps*
>  1. Install Spark
>  2. Start Spark Beeline
>  3. Submit some SQL queries
>  4. Close some Spark applications
>  5. Check the JDBC/ODBC Server tab in the Spark Web UI.
> *Issue:*
>  *Sorting (ascending or descending)* on the *Duration* and *Execution time* columns is not correct in the *JDBC/ODBC Server* tab.
>  The issue is present in both the *Session Statistics* and *SQL Statistics* tables; please check it (see the sketch after the quoted description below).
> Screenshots are attached.
> !Duration_1.png|width=826,height=410!
> !ExecutionTime_1.png|width=823,height=407!
>  
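
A likely explanation, offered here as an assumption rather than anything confirmed in this thread, is that the table sorts the rendered Duration / Execution Time text instead of the underlying millisecond values. The short Scala sketch below uses a hypothetical formatDuration helper (not the actual Spark UI code) to show how a plain text sort scrambles durations:

    object DurationSortSketch {
      // Hypothetical stand-in for the kind of human-readable formatting the Web UI shows.
      def formatDuration(ms: Long): String =
        if (ms < 1000) s"$ms ms"
        else if (ms < 60000) f"${ms / 1000.0}%.1f s"
        else f"${ms / 60000.0}%.1f min"

      def main(args: Array[String]): Unit = {
        val durationsMs = Seq(95L, 3000L, 45000L, 120000L)

        // Sorting the formatted strings is lexicographic: "2.0 min" lands before "95 ms".
        val sortedAsText = durationsMs.map(formatDuration).sorted

        // Sorting the raw millisecond values first gives the order a user expects.
        val sortedAsNumbers = durationsMs.sorted.map(formatDuration)

        println(s"text sort:    $sortedAsText")    // List(2.0 min, 3.0 s, 45.0 s, 95 ms)
        println(s"numeric sort: $sortedAsNumbers") // List(95 ms, 3.0 s, 45.0 s, 2.0 min)
      }
    }

Sorting by the raw numeric value, and formatting only for display, is the usual way to make such columns order correctly.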



