Posted to reviews@spark.apache.org by "msetodev (via GitHub)" <gi...@apache.org> on 2023/11/13 19:27:11 UTC

Re: [PR] [SPARK-39489][CORE] Improve event logging JsonProtocol performance by using Jackson instead of Json4s [spark]

msetodev commented on code in PR #36885:
URL: https://github.com/apache/spark/pull/36885#discussion_r1391547071


##########
core/src/test/scala/org/apache/spark/deploy/history/EventLogTestHelper.scala:
##########


Review Comment:
   This change effectively broke the open-source library that ships Azure Databricks logs to Azure Log Analytics. Here is the relevant snippet from UnifiedSparkListener.scala:
   
   
   ```scala
   protected def logSparkListenerEvent(
       event: SparkListenerEvent,
       getTimestamp: () => Instant = () => Instant.now()): Unit = {
     val json = try {
       // Add a well-known time field.
       Some(
         JsonProtocol.sparkEventToJson(event)
           .merge(render(
             SparkInformation.get() + ("SparkEventTime" -> getTimestamp().toString)
           ))
       )
     } catch {
       case NonFatal(e) =>
         logError(s"Error serializing SparkListenerEvent to JSON: $event", e)
         None
     }
     sendToSink(json)
   }
   ```
   
   
   The library's Maven project now fails to compile because `JsonProtocol.sparkEventToJson` no longer exists.
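   
   For anyone hitting the same wall, here is a minimal migration sketch. It assumes the Jackson-based replacement introduced by this PR is `JsonProtocol.sparkEventToJsonString`, which returns a JSON string rather than a json4s `JValue`, so the string is re-parsed before the existing merge. `SparkInformation`, `logError`, and `sendToSink` are the library's own members and are not shown here:
   
   ```scala
   // Migration sketch only, not the library's official fix. Note that
   // JsonProtocol is spark-private, so this must live under an
   // org.apache.spark.* package (as the library's listener already does).
   import java.time.Instant
   
   import scala.util.control.NonFatal
   
   import org.json4s.JsonDSL._
   import org.json4s.jackson.JsonMethods.{parse, render}
   
   import org.apache.spark.scheduler.SparkListenerEvent
   import org.apache.spark.util.JsonProtocol
   
   protected def logSparkListenerEvent(
       event: SparkListenerEvent,
       getTimestamp: () => Instant = () => Instant.now()): Unit = {
     val json = try {
       Some(
         // sparkEventToJsonString returns a String, so parse it back into a
         // JValue before merging the extra well-known fields as before.
         parse(JsonProtocol.sparkEventToJsonString(event))
           .merge(render(
             SparkInformation.get() + ("SparkEventTime" -> getTimestamp().toString)
           ))
       )
     } catch {
       case NonFatal(e) =>
         logError(s"Error serializing SparkListenerEvent to JSON: $event", e)
         None
     }
     sendToSink(json)
   }
   ```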



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

