Posted to issues@spark.apache.org by "jobit mathew (Jira)" <ji...@apache.org> on 2019/09/11 09:33:00 UTC

[jira] [Created] (SPARK-29051) Spark Application UI search is not working for some fields

jobit mathew created SPARK-29051:
------------------------------------

             Summary: Spark Application UI search is not working for some fields
                 Key: SPARK-29051
                 URL: https://issues.apache.org/jira/browse/SPARK-29051
             Project: Spark
          Issue Type: Bug
          Components: Web UI
    Affects Versions: 2.4.4, 2.4.3
            Reporter: jobit mathew


Spark Application UI *search is not working* for some fields in the *Spark Web UI Executors tab* and on the Spark Job History Server page.

*Test Steps*
 1. Install Spark
 2. Start Spark SQL/Shell/Beeline
 3. Submit some SQL queries (a reproduction sketch follows these steps)
 4. Close some Spark applications
 5. Check the Spark Web UI Executors tab and verify search
 6. Check the Spark Job History Server page and verify search
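
Below is a minimal reproduction sketch for steps 2, 3 and 5, assuming a local spark-shell session; the table name, the data and the exact values shown in the UI (34.5 KB, 384.1, GC time, etc.) are illustrative and will differ per environment:

{code:scala}
// Run inside spark-shell (e.g. ./bin/spark-shell); `spark` is the
// pre-created SparkSession. Table name and data are illustrative only.
val df = spark.range(0, 1000000).selectExpr("id", "id % 100 as key")
df.createOrReplaceTempView("repro_table")

// Submit a few SQL queries so the Executors tab accumulates task time,
// GC time and input-size values that can be searched for.
spark.sql("SELECT key, count(*) FROM repro_table GROUP BY key").collect()
spark.sql("SELECT key, sum(id) FROM repro_table GROUP BY key").collect()

// Cache the table so the Storage memory column shows a non-zero value.
spark.table("repro_table").cache().count()

// Then open http://<driver-host>:4040/executors/ and try searching for
// the values displayed in the table (e.g. "34.5", "345", "384.1").
{code}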

*Issue 1*

Searching for some field contents does not work in the *Spark Web UI Executors tab* (Spark SQL/Shell/JDBC server UIs).

• *Input column* (search behaves incorrectly. For example, if the Input value is 34.5 KB, searching for 34.5 returns no result, but searching for 345 does, which is wrong)
 • Task time search is OK, but *GC time* search is not working
 • *Thread Dump* - search is not working [need to confirm whether this field should be searchable, but since the stdout text is searchable, the Thread Dump text should also be searchable]
 • *Storage memory* - for example, searching for 384.1 returns no result.

*Issue 2*

On the *Spark Job History Server page* (completed tasks), search is not working on the *Duration column* values. We get the proper search result when searching for content from any column other than Duration. *For example, if the Duration is 6.1 min*, we cannot find a result for 6.1 min or even 6.1.
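
For Issue 2, a hedged sketch of producing a completed application whose Duration renders in minutes; the event-log settings, the sleep length and the History Server start script are assumptions based on a default standalone setup, not taken from this report:

{code:scala}
// Start the shell with event logging so the application appears on the
// History Server afterwards (flags and paths are illustrative):
//   ./bin/spark-shell --conf spark.eventLog.enabled=true \
//                     --conf spark.eventLog.dir=/tmp/spark-events

// Keep the application alive a bit over six minutes so its Duration
// renders as something like "6.1 min" on the History Server page.
Thread.sleep(6 * 60 * 1000L + 10000L)

// Quit the shell so the application is marked completed, start the
// History Server (./sbin/start-history-server.sh), open its web UI and
// search the completed-applications table for "6.1 min" or "6.1".
{code}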

 



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org