Posted to issues@spark.apache.org by "Wei Liu (Jira)" <ji...@apache.org> on 2024/01/08 22:00:00 UTC

[jira] [Updated] (SPARK-46627) Streaming UI hover-over shows incorrect value

     [ https://issues.apache.org/jira/browse/SPARK-46627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wei Liu updated SPARK-46627:
----------------------------
    Description: 
Running a simple streaming query:

val df = spark.readStream.format("rate").option("rowsPerSecond", "50000000").load()

val q = df.writeStream.format("noop").start()
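The two lines above assume a spark-shell session where `spark` is already provided. A self-contained reproduction sketch might look like the following (the app name and the 30-second run window are illustrative choices, not part of the original report):

```scala
// Minimal reproduction sketch for SPARK-46627; assumes Spark (with spark-sql)
// is on the classpath. In spark-shell, skip the SparkSession setup.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("SPARK-46627-repro")
  .getOrCreate()

// The rate source emits `rowsPerSecond` rows per second with
// (timestamp, value) columns; 50M/s makes the UI charts fill quickly.
val df = spark.readStream
  .format("rate")
  .option("rowsPerSecond", "50000000")
  .load()

// The noop sink discards all rows; it exists only to drive the query.
val q = df.writeStream.format("noop").start()

// Let a few batches complete, then open http://localhost:4040,
// go to the Structured Streaming tab, and hover over the timeline
// charts to observe the tooltip value.
q.awaitTermination(30000)

q.stop()
spark.stop()
```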

 

The hover-over value in the streaming UI is incorrect (the tooltip shows "321.00 at undefined")

 
!Screenshot 2024-01-08 at 1.55.57 PM.png!

  was:
Running a simple streaming query:

val df = spark.readStream.format("rate").option("rowsPerSecond", "50000000").load()

val q = df.writeStream.format("noop").start()

 

The hover-over value in the streaming UI is incorrect:

 
!https://files.slack.com/files-tmb/T02727P8HV4-F06CJ83D3JT-b44210f391/image_720.png!


> Streaming UI hover-over shows incorrect value
> ---------------------------------------------
>
>                 Key: SPARK-46627
>                 URL: https://issues.apache.org/jira/browse/SPARK-46627
>             Project: Spark
>          Issue Type: Task
>          Components: Structured Streaming, UI, Web UI
>    Affects Versions: 4.0.0
>            Reporter: Wei Liu
>            Priority: Major
>         Attachments: Screenshot 2024-01-08 at 1.55.57 PM.png
>
>
> Running a simple streaming query:
> val df = spark.readStream.format("rate").option("rowsPerSecond", "50000000").load()
> val q = df.writeStream.format("noop").start()
>  
> The hover-over value in the streaming UI is incorrect (the tooltip shows "321.00 at undefined")
>  
> !Screenshot 2024-01-08 at 1.55.57 PM.png!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org