Posted to dev@pig.apache.org by "liyunzhang_intel (JIRA)" <ji...@apache.org> on 2017/02/23 08:22:44 UTC

[jira] [Commented] (PIG-5134) fix TestAvroStorage unit test failures after PIG-5132

    [ https://issues.apache.org/jira/browse/PIG-5134?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15880096#comment-15880096 ] 

liyunzhang_intel commented on PIG-5134:
---------------------------------------

[~nkollar]: TestAvroStorage is not included in test/spark-tests; we did not run it before. Are you sure this passed before we merged the trunk changes?

> fix  TestAvroStorage unit test failures after PIG-5132
> ------------------------------------------------------
>
>                 Key: PIG-5134
>                 URL: https://issues.apache.org/jira/browse/PIG-5134
>             Project: Pig
>          Issue Type: Sub-task
>          Components: spark
>            Reporter: liyunzhang_intel
>            Assignee: Nandor Kollar
>             Fix For: spark-branch
>
>
> It seems that the test fails because Avro's GenericData#Record doesn't implement the Serializable interface:
> {code}
> 2017-02-23 09:14:41,887 ERROR [main] spark.JobGraphBuilder (JobGraphBuilder.java:sparkOperToRDD(183)) - throw exception in sparkOperToRDD: 
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0 in stage 9.0 (TID 9) had a not serializable result: org.apache.avro.generic.GenericData$Record
> Serialization stack:
> 	- object not serializable (class: org.apache.avro.generic.GenericData$Record, value: {"key": "stuff in closet", "value1": {"thing": "hat", "count": 7}, "value2": {"thing": "coat", "count": 2}})
> 	- field (class: org.apache.pig.impl.util.avro.AvroTupleWrapper, name: avroObject, type: interface org.apache.avro.generic.IndexedRecord)
> 	- object (class org.apache.pig.impl.util.avro.AvroTupleWrapper, org.apache.pig.impl.util.avro.AvroTupleWrapper@3d3a58c1)
> 	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
> 	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
> {code}
> The failing test is new, introduced when merging trunk into the spark branch; that's why we didn't see this error before.
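> For illustration only (not the actual patch), below is a minimal sketch of one way a wrapper around a non-Serializable Avro record could be made Java-serializable: keep the record transient and hand-write its schema plus Avro binary encoding. The class name SerializableAvroRecordWrapper is hypothetical; Pig's real AvroTupleWrapper would need equivalent treatment, or the Spark job would need a serializer that can handle Avro records.
> {code}
> import java.io.IOException;
> import java.io.ObjectInputStream;
> import java.io.ObjectOutputStream;
> import java.io.Serializable;
>
> import org.apache.avro.Schema;
> import org.apache.avro.generic.GenericDatumReader;
> import org.apache.avro.generic.GenericDatumWriter;
> import org.apache.avro.generic.IndexedRecord;
> import org.apache.avro.io.BinaryDecoder;
> import org.apache.avro.io.BinaryEncoder;
> import org.apache.avro.io.DecoderFactory;
> import org.apache.avro.io.EncoderFactory;
>
> // Hypothetical wrapper (not Pig's AvroTupleWrapper) showing one way to let a
> // non-Serializable Avro record survive Java serialization.
> public class SerializableAvroRecordWrapper implements Serializable {
>
>     // GenericData$Record itself is not Serializable, so keep it transient.
>     private transient IndexedRecord record;
>
>     public SerializableAvroRecordWrapper(IndexedRecord record) {
>         this.record = record;
>     }
>
>     public IndexedRecord get() {
>         return record;
>     }
>
>     private void writeObject(ObjectOutputStream out) throws IOException {
>         out.defaultWriteObject();
>         // Write the schema as JSON (assumes it fits writeUTF's 64KB limit),
>         // then the record as Avro binary.
>         out.writeUTF(record.getSchema().toString());
>         GenericDatumWriter<IndexedRecord> writer =
>                 new GenericDatumWriter<>(record.getSchema());
>         BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
>         writer.write(record, encoder);
>         encoder.flush();
>     }
>
>     private void readObject(ObjectInputStream in)
>             throws IOException, ClassNotFoundException {
>         in.defaultReadObject();
>         Schema schema = new Schema.Parser().parse(in.readUTF());
>         GenericDatumReader<IndexedRecord> reader = new GenericDatumReader<>(schema);
>         // A direct decoder avoids read-ahead buffering past the record bytes
>         // in the surrounding object stream.
>         BinaryDecoder decoder = DecoderFactory.get().directBinaryDecoder(in, null);
>         record = reader.read(null, decoder);
>     }
> }
> {code}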


