Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2017/09/12 20:39:00 UTC

[jira] [Commented] (SPARK-21987) Spark 2.3 cannot read 2.2 event logs

    [ https://issues.apache.org/jira/browse/SPARK-21987?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16163622#comment-16163622 ] 

Xiao Li commented on SPARK-21987:
---------------------------------

Thanks for reporting this! We need to ensure Spark 2.3 can still process 2.2 event logs, and revert the changes in SparkPlanGraph.
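
For context, the failure comes from Jackson's default behavior of rejecting JSON fields that the target class no longer declares. Below is a minimal sketch of that behavior and of one way a reader could be made tolerant of the removed field; it uses a hypothetical stand-in class, not the actual Spark source or patch.

{noformat}
// Sketch (not Spark code): reproduces the Jackson behavior behind the error
// and shows one way to tolerate fields a class no longer declares.
// "PlanInfo" is a hypothetical stand-in for SparkPlanInfo after SPARK-17701.
import com.fasterxml.jackson.databind.{DeserializationFeature, ObjectMapper}
import com.fasterxml.jackson.module.scala.DefaultScalaModule

case class PlanInfo(nodeName: String, simpleString: String)

object EventLogCompatDemo {
  def main(args: Array[String]): Unit = {
    val mapper = new ObjectMapper().registerModule(DefaultScalaModule)

    // JSON written by an older version that still carried a "metadata" field.
    val oldJson = """{"nodeName":"Exchange","simpleString":"Exchange","metadata":{}}"""

    // Default behavior: throws UnrecognizedPropertyException, as in the log below.
    // mapper.readValue(oldJson, classOf[PlanInfo])

    // Tolerating unknown fields keeps old event logs readable.
    mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
    println(mapper.readValue(oldJson, classOf[PlanInfo]))  // PlanInfo(Exchange,Exchange)
  }
}
{noformat}

The sketch only illustrates why old logs trip the parser; reverting the SparkPlanGraph change, as suggested above, avoids the incompatibility at the source.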

> Spark 2.3 cannot read 2.2 event logs
> ------------------------------------
>
>                 Key: SPARK-21987
>                 URL: https://issues.apache.org/jira/browse/SPARK-21987
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Marcelo Vanzin
>            Priority: Blocker
>
> Reported by [~jincheng] in a comment in SPARK-18085:
> {noformat}
> com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "metadata" (class org.apache.spark.sql.execution.SparkPlanInfo), not marked as ignorable (4 known properties: "simpleString", "nodeName", "children", "metrics"])
>  at [Source: {"Event":"org.apache.spark.sql.execution.ui.SparkListenerSQLExecutionStart","executionId":0,"description":"json at NativeMethodAccessorImpl.java:0","details":"org.apache.spark.sql.DataFrameWriter.json(DataFrameWriter.scala:487)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:498)\npy4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)\npy4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)\npy4j.Gateway.invoke(Gateway.java:280)\npy4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)\npy4j.commands.CallCommand.execute(CallCommand.java:79)\npy4j.GatewayConnection.run(GatewayConnection.java:214)\njava.lang.Thread.run(Thread.java:748)","physicalPlanDescription":"== Parsed Logical Plan ==\nRepartition 200, true\n+- LogicalRDD [uid#327L, gids#328]\n\n== Analyzed Logical Plan ==\nuid: bigint, gids: array<bigint>\nRepartition 200, true\n+- LogicalRDD [uid#327L, gids#328]\n\n== Optimized Logical Plan ==\nRepartition 200, true\n+- LogicalRDD [uid#327L, gids#328]\n\n== Physical Plan ==\nExchange RoundRobinPartitioning(200)\n+- Scan ExistingRDD[uid#327L,gids#328]","sparkPlanInfo":{"nodeName":"Exchange","simpleString":"Exchange RoundRobinPartitioning(200)","children":[{"nodeName":"ExistingRDD","simpleString":"Scan ExistingRDD[uid#327L,gids#328]","children":[],"metadata":{},"metrics":[{"name":"number of output rows","accumulatorId":140,"metricType":"sum"}]}],"metadata":{},"metrics":[{"name":"data size total (min, med, max)","accumulatorId":139,"metricType":"size"}]},"time":1504837052948}; line: 1, column: 1622] (through reference chain: org.apache.spark.sql.execution.ui.SparkListenerSQLExecutionStart["sparkPlanInfo"]->org.apache.spark.sql.execution.SparkPlanInfo["children"]->com.fasterxml.jackson.module.scala.deser.BuilderWrapper[0]->org.apache.spark.sql.execution.SparkPlanInfo["metadata"])
> 	at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:51)
> {noformat}
> This was caused by SPARK-17701 (which at this moment is still open even though the patch has been committed).
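
A per-class alternative to the mapper-level setting sketched earlier (again a hedged illustration with a hypothetical class, not a proposed patch) is to mark the deserialization target itself as tolerant of unknown JSON fields:

{noformat}
// Hypothetical sketch: annotate the target class so Jackson ignores any JSON
// field it no longer declares (e.g. the old "metadata" entry from 2.2 logs).
import com.fasterxml.jackson.annotation.JsonIgnoreProperties

@JsonIgnoreProperties(ignoreUnknown = true)
case class PlanInfoCompat(
    nodeName: String,
    simpleString: String,
    children: Seq[PlanInfoCompat])
{noformat}

Whether to ignore the removed field or to restore it is a compatibility decision for the Spark side; the sketches only show the Jackson mechanics.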


