Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/02/03 13:21:00 UTC

[jira] [Resolved] (SPARK-26818) Make MLEvents JSON ser/de safe

     [ https://issues.apache.org/jira/browse/SPARK-26818?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-26818.
----------------------------------
       Resolution: Fixed
    Fix Version/s: 3.0.0

Issue resolved by pull request 23728
[https://github.com/apache/spark/pull/23728]

> Make MLEvents JSON ser/de safe
> ------------------------------
>
>                 Key: SPARK-26818
>                 URL: https://issues.apache.org/jira/browse/SPARK-26818
>             Project: Spark
>          Issue Type: Bug
>          Components: ML
>    Affects Versions: 3.0.0
>            Reporter: Hyukjin Kwon
>            Assignee: Hyukjin Kwon
>            Priority: Major
>             Fix For: 3.0.0
>
>
> It looks like ML events are not JSON serializable. We can make them serializable by marking the non-serializable fields with @JsonIgnore, following the same pattern the SQL events already use, for example:
> {code}
> package org.apache.spark.sql.execution.ui
>
> import com.fasterxml.jackson.annotation.JsonIgnore
>
> import org.apache.spark.annotation.DeveloperApi
> import org.apache.spark.scheduler.SparkListenerEvent
> import org.apache.spark.sql.execution.QueryExecution
>
> @DeveloperApi
> case class SparkListenerSQLExecutionEnd(executionId: Long, time: Long)
>   extends SparkListenerEvent {
>   // The name of the execution, e.g. `df.collect` will trigger a SQL execution with name "collect".
>   @JsonIgnore private[sql] var executionName: Option[String] = None
>   // The following 3 fields are only accessed when `executionName` is defined.
>   // The duration of the SQL execution, in nanoseconds.
>   @JsonIgnore private[sql] var duration: Long = 0L
>   // The `QueryExecution` instance that represents the SQL execution.
>   @JsonIgnore private[sql] var qe: QueryExecution = null
>   // The exception object that caused this execution to fail. None if the execution doesn't fail.
>   @JsonIgnore private[sql] var executionFailure: Option[Exception] = None
> }
> {code}
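>
> Applied to the ML events themselves, the pattern would look roughly like the sketch below. This is illustrative, not the merged change from the pull request: the event name FitStart matches the MLEvents work, but the exact fields and visibility shown here are assumptions. The idea is the same as above: keep the case-class constructor free of non-serializable types so Jackson can round-trip the event to JSON, and attach heavyweight references as @JsonIgnore vars.
> {code}
> package org.apache.spark.ml
>
> import com.fasterxml.jackson.annotation.JsonIgnore
>
> import org.apache.spark.scheduler.SparkListenerEvent
> import org.apache.spark.sql.Dataset
>
> // Illustrative sketch only; field names and visibility are assumptions.
> trait MLEvent extends SparkListenerEvent
>
> case class FitStart[M <: Model[M]]() extends MLEvent {
>   // The `Estimator` being fit. Not JSON serializable, so excluded from ser/de.
>   @JsonIgnore private[ml] var estimator: Estimator[M] = _
>   // The input `Dataset`. Not JSON serializable, so excluded from ser/de.
>   @JsonIgnore private[ml] var dataset: Dataset[_] = _
> }
> {code}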



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org