Posted to issues@spark.apache.org by "dimtiris kanoute (Jira)" <ji...@apache.org> on 2022/06/24 13:50:00 UTC

[jira] [Created] (SPARK-39580) Write event logs in a continuous manner

dimtiris kanoute created SPARK-39580:
----------------------------------------

             Summary: Write event logs in a continuous manner
                 Key: SPARK-39580
                 URL: https://issues.apache.org/jira/browse/SPARK-39580
             Project: Spark
          Issue Type: New Feature
          Components: Spark Core
    Affects Versions: 3.3.0
            Reporter: dimtiris kanoute


We are using Spark Thrift Server as a service to run Spark SQL queries.
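
For context, a minimal sketch (app name and bucket path are placeholders) of the documented {{spark.eventLog.*}} settings such a deployment enables, shown here as SparkSession builder calls for brevity; in practice the same keys are passed via {{--conf}} to {{sbin/start-thriftserver.sh}}:

{code:scala}
// Minimal sketch, not an exact deployment: the documented event log settings
// a Thrift Server deployment would typically enable. Bucket path is a placeholder.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("Spark Thrift Server")
  .config("spark.eventLog.enabled", "true")                         // write an event log for the application
  .config("spark.eventLog.dir", "s3a://example-bucket/event-logs")  // placeholder cloud storage path
  .config("spark.eventLog.compress", "true")                        // optional compression
  .enableHiveSupport()
  .getOrCreate()
{code}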

Currently, Spark writes the *event logs* only once the job is done or gracefully killed.

This means that for Spark Thrift Server, which runs as a single long-lived job, we cannot access the event logs after an unexpected error (e.g. OOM), because they are never written due to the unexpected shutdown.

We would like to suggest, as an improvement, adding the functionality for Spark *to write the event logs to cloud storage in a continuous manner.*
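
To make this concrete, a purely illustrative sketch: the flush-interval key below does *not* exist in Spark today and only shows the shape of the behaviour being proposed.

{code:scala}
// Hypothetical illustration only: "spark.eventLog.flush.interval" is NOT an
// existing Spark configuration. The idea is that events would be flushed to the
// (cloud) event log directory periodically while the application runs, so the
// logs survive an abrupt shutdown such as an OOM kill.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("Spark Thrift Server")
  .config("spark.eventLog.enabled", "true")
  .config("spark.eventLog.dir", "s3a://example-bucket/event-logs")  // placeholder path
  .config("spark.eventLog.flush.interval", "30s")                   // hypothetical key, for illustration
  .getOrCreate()
{code}

Events flushed before a crash would then remain readable from the event log directory.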

 



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org