Posted to issues@spark.apache.org by "Ritika Maheshwari (JIRA)" <ji...@apache.org> on 2017/10/11 02:52:00 UTC
[jira] [Commented] (SPARK-22241) Apache spark giving InvalidSchemaException: Cannot write a schema with an empty group: optional group element {
[ https://issues.apache.org/jira/browse/SPARK-22241?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16199724#comment-16199724 ]
Ritika Maheshwari commented on SPARK-22241:
-------------------------------------------
I know Parquet does not allow empty struct types, but is this something the Encoder should handle when generating the schema?
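For context, a minimal sketch (plain JDK reflection, no Spark) of one plausible cause: the bean encoder has only the getter/setter signatures to work from, so if the getter returns a raw ArrayList the element type is erased and nothing useful is left to map, which is consistent with the empty struct in the reported schema. The RawBean/TypedBean names below are hypothetical, not from the report.

```java
import java.lang.reflect.Method;
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.ArrayList;
import java.util.List;

public class ElementTypeDemo {
    // Hypothetical bean with a raw ArrayList property: the element type is erased.
    public static class RawBean {
        private ArrayList pixelSpacing = new ArrayList();
        public ArrayList getPixelSpacing() { return pixelSpacing; }
        public void setPixelSpacing(ArrayList v) { pixelSpacing = v; }
    }

    // Same bean with a parameterized List<Double>: the element type is visible.
    public static class TypedBean {
        private List<Double> pixelSpacing = new ArrayList<>();
        public List<Double> getPixelSpacing() { return pixelSpacing; }
        public void setPixelSpacing(List<Double> v) { pixelSpacing = v; }
    }

    // Reports what reflection sees as the list's element type via the getter,
    // which is all a bean-based encoder has available at runtime.
    public static String elementTypeOf(Class<?> beanClass) throws Exception {
        Method getter = beanClass.getMethod("getPixelSpacing");
        Type t = getter.getGenericReturnType();
        if (t instanceof ParameterizedType) {
            return ((ParameterizedType) t).getActualTypeArguments()[0].getTypeName();
        }
        return "<erased: " + t.getTypeName() + ">"; // no element type recoverable
    }

    public static void main(String[] args) throws Exception {
        System.out.println(elementTypeOf(RawBean.class));
        System.out.println(elementTypeOf(TypedBean.class));
    }
}
```

Running this shows the raw getter yields no type argument while the parameterized one reports java.lang.Double, so the declared property signature, not the runtime list contents, decides what the encoder can infer.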
> Apache spark giving InvalidSchemaException: Cannot write a schema with an empty group: optional group element {
> ---------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-22241
> URL: https://issues.apache.org/jira/browse/SPARK-22241
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.2.0
> Reporter: Ritika Maheshwari
> Priority: Minor
>
> I have a bean with a field of type ArrayList of Doubles. Then I do the following:
> Encoder<T> beanEncoder = Encoders.bean(jClass);
> Dataset<T> df = spark.createDataset(Collections.singletonList((T) extractedObj), beanEncoder);
> The generated schema is:
> |-- pixelSpacing: array (nullable = true)
> |    |-- element: struct (containsNull = true)
> Now I try to save this Dataset as Parquet:
> df.write().mode(SaveMode.Append).parquet(jClass.getName() + "_parquet");
> and I get the error:
> Caused by: org.apache.parquet.schema.InvalidSchemaException: Cannot write a schema with an empty group: optional group element {}
> Kindly advise how to specify an ArrayList of Strings or Doubles in the bean passed to the encoder so that the generated schema can be saved as Parquet.
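A possible workaround, assuming the empty group stems from an erased element type: declare the property with a parameterized List<Double> (with matching getter/setter) rather than a raw ArrayList, so Encoders.bean can infer an array-of-double column. The ImageMeta bean below is illustrative, not from the report; the commented Spark calls just repeat the report's usage.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical bean: the parameterized getter/setter pair exposes the
// element type (Double) to reflection-based schema inference.
public class ImageMeta {
    private List<Double> pixelSpacing = new ArrayList<>();

    public List<Double> getPixelSpacing() { return pixelSpacing; }

    public void setPixelSpacing(List<Double> pixelSpacing) {
        this.pixelSpacing = pixelSpacing;
    }
}

// Usage with Spark, as in the report (requires a SparkSession; not runnable here):
//   Encoder<ImageMeta> enc = Encoders.bean(ImageMeta.class);
//   Dataset<ImageMeta> ds = spark.createDataset(
//       Collections.singletonList(meta), enc);
//   ds.write().mode(SaveMode.Append).parquet("ImageMeta_parquet");
//
// With a parameterized property the schema should come out as a
// typed array element rather than an empty struct, e.g.:
//   |-- pixelSpacing: array (nullable = true)
//   |    |-- element: double (containsNull = true)
```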
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org