Posted to issues@spark.apache.org by "Harish Butani (JIRA)" <ji...@apache.org> on 2015/06/04 03:05:38 UTC
[jira] [Updated] (SPARK-8093) Failure to save empty json object as parquet
[ https://issues.apache.org/jira/browse/SPARK-8093?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Harish Butani updated SPARK-8093:
---------------------------------
Attachment: t1.json
> Failure to save empty json object as parquet
> --------------------------------------------
>
> Key: SPARK-8093
> URL: https://issues.apache.org/jira/browse/SPARK-8093
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.4.0
> Reporter: Harish Butani
> Attachments: t1.json
>
>
> This is similar to SPARK-3365. A sample JSON file is attached. Code to reproduce:
> {code}
> val jsonDF = sqlContext.read.json("/tmp/t1.json")
> jsonDF.write.parquet("/tmp/t1.parquet")
> {code}
> The 'integration' object is empty in the JSON.
> Stack trace:
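> A possible stopgap until this is fixed (a sketch only, assuming the empty object is a top-level field named 'integration' as described above; the actual layout of the attached t1.json may differ): drop the empty struct column before writing, since a Parquet group must contain at least one field and a zero-field struct cannot be encoded.
> {code}
> // spark-shell, Spark 1.4.0
> val jsonDF = sqlContext.read.json("/tmp/t1.json")
> // Remove the zero-field struct; the remaining columns map to valid Parquet groups.
> val writable = jsonDF.drop("integration")
> writable.write.parquet("/tmp/t1.parquet")
> {code}
> This loses the empty column entirely, so it is only workable when the field carries no data, as is the case here.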
> {code}
> ....
> Caused by: java.io.IOException: Could not read footer: java.lang.IllegalStateException: Cannot build an empty group
> at parquet.hadoop.ParquetFileReader.readAllFootersInParallel(ParquetFileReader.java:238)
> at org.apache.spark.sql.parquet.ParquetRelation2$MetadataCache.refresh(newParquet.scala:369)
> at org.apache.spark.sql.parquet.ParquetRelation2.org$apache$spark$sql$parquet$ParquetRelation2$$metadataCache$lzycompute(newParquet.scala:154)
> at org.apache.spark.sql.parquet.ParquetRelation2.org$apache$spark$sql$parquet$ParquetRelation2$$metadataCache(newParquet.scala:152)
> at org.apache.spark.sql.parquet.ParquetRelation2.refresh(newParquet.scala:197)
> at org.apache.spark.sql.sources.InsertIntoHadoopFsRelation.insert(commands.scala:134)
> ... 69 more
> Caused by: java.lang.IllegalStateException: Cannot build an empty group
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org