Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/03/07 08:45:33 UTC
[jira] [Commented] (SPARK-19751) Create Data frame API fails with a self referencing bean
[ https://issues.apache.org/jira/browse/SPARK-19751?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15898967#comment-15898967 ]
Apache Spark commented on SPARK-19751:
--------------------------------------
User 'maropu' has created a pull request for this issue:
https://github.com/apache/spark/pull/17188
> Create Data frame API fails with a self referencing bean
> --------------------------------------------------------
>
> Key: SPARK-19751
> URL: https://issues.apache.org/jira/browse/SPARK-19751
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.1.0
> Reporter: Avinash Venkateshaiah
> Priority: Minor
>
> The createDataset API throws a StackOverflowError when we try to create a
> Dataset using a bean encoder and the bean is self-referencing.
> BEAN:
> public class HierObj implements Serializable {
>     String name;
>     List<HierObj> children;
>
>     public String getName() {
>         return name;
>     }
>     public void setName(String name) {
>         this.name = name;
>     }
>     public List<HierObj> getChildren() {
>         return children;
>     }
>     public void setChildren(List<HierObj> children) {
>         this.children = children;
>     }
> }
> // create an object
> HierObj hierObj = new HierObj();
> hierObj.setName("parent");
> List<HierObj> children = new ArrayList<>();
> HierObj child1 = new HierObj();
> child1.setName("child1");
> HierObj child2 = new HierObj();
> child2.setName("child2");
> children.add(child1);
> children.add(child2);
> hierObj.setChildren(children);
> // create a dataset
> Dataset<HierObj> ds = sparkSession().createDataset(Arrays.asList(hierObj), Encoders.bean(HierObj.class));
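For context: the overflow comes from bean schema inference, which walks a bean's readable properties recursively, so a type that can reach itself (here, HierObj via its List<HierObj> children property) never terminates. The following is a minimal, Spark-free sketch of that recursion using java.beans introspection; the reaches() helper and the cycle check are illustrative, not Spark's actual implementation:

```java
import java.beans.Introspector;
import java.beans.PropertyDescriptor;
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class CycleCheck {
    // Returns true if 'type' can reach a class that is already on the
    // current traversal path, i.e. the bean graph is self-referencing.
    static boolean reaches(Type type, Set<Class<?>> onPath) throws Exception {
        if (type instanceof ParameterizedType) {
            // e.g. List<HierObj>: descend into the element type, the way a
            // schema inference resolves collection types
            for (Type arg : ((ParameterizedType) type).getActualTypeArguments()) {
                if (reaches(arg, onPath)) return true;
            }
            return false;
        }
        if (!(type instanceof Class)) return false;
        Class<?> cls = (Class<?>) type;
        // Treat primitives and JDK types (String, List, ...) as leaves
        if (cls.isPrimitive() || cls.getName().startsWith("java.")) return false;
        if (!onPath.add(cls)) return true; // cycle: class already on the path
        for (PropertyDescriptor pd :
                Introspector.getBeanInfo(cls, Object.class).getPropertyDescriptors()) {
            if (pd.getReadMethod() == null) continue;
            if (reaches(pd.getReadMethod().getGenericReturnType(), onPath)) return true;
        }
        onPath.remove(cls);
        return false;
    }

    // Bean shaped like HierObj from the report above
    public static class HierObj {
        private String name;
        private List<HierObj> children;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public List<HierObj> getChildren() { return children; }
        public void setChildren(List<HierObj> children) { this.children = children; }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(reaches(HierObj.class, new HashSet<>())); // prints true
    }
}
```

Until bean encoders handle recursive types, possible workarounds are to break the cycle in the bean itself (e.g. store child names instead of child objects), or to use a binary encoder such as Encoders.kryo(HierObj.class), which serializes the object graph without inferring a schema from the bean's properties.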
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org