Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2019/10/30 16:36:00 UTC

[jira] [Comment Edited] (SPARK-29662) Cannot have circular references in bean class, but got the circular reference of class class io.cdap.cdap.api.data.schema.Schema

    [ https://issues.apache.org/jira/browse/SPARK-29662?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16963198#comment-16963198 ] 

Dongjoon Hyun edited comment on SPARK-29662 at 10/30/19 4:35 PM:
-----------------------------------------------------------------

Hi, [~coudray]. What is `CDAP`? Since Apache Spark 2.3.x is EOL, could you try Apache Spark 2.4.4?
We regularly close outdated JIRAs that report only EOL releases (<= 2.3.x).


was (Author: dongjoon):
Hi, [~coudray]. What is `CDAP`? Since Apache Spark 2.3.x is EOL, could you try Apache Spark 2.4.4?

> Cannot have circular references in bean class, but got the circular reference of class class io.cdap.cdap.api.data.schema.Schema
> --------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-29662
>                 URL: https://issues.apache.org/jira/browse/SPARK-29662
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.4
>            Reporter: romain
>            Priority: Major
>
> I'm unable to convert a JavaRDD<StructuredRecord> to a Dataset<Row>.
> I'm using CDAP 6.0.0 or 5.1.2 with Spark 2.3.4.
> Encoder<StructuredRecord> encoderStruct = Encoders.bean(StructuredRecord.class);
> This line produces the error:
> "Cannot have circular references in bean class, but got the circular reference of class class io.cdap.cdap.api.data.schema.Schema"



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org