Posted to issues@spark.apache.org by "Ravindra Pesala (JIRA)" <ji...@apache.org> on 2014/09/19 09:39:34 UTC

[jira] [Commented] (SPARK-3536) SELECT on empty parquet table throws exception

    [ https://issues.apache.org/jira/browse/SPARK-3536?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14140145#comment-14140145 ] 

Ravindra Pesala commented on SPARK-3536:
----------------------------------------

Parquet returns null metadata when querying an empty parquet file while calculating splits, so adding a null check and returning empty splits fixes the issue.

> SELECT on empty parquet table throws exception
> ----------------------------------------------
>
>                 Key: SPARK-3536
>                 URL: https://issues.apache.org/jira/browse/SPARK-3536
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Michael Armbrust
>              Labels: starter
>
> Reported by [~matei].  Reproduce as follows:
> {code}
> scala> case class Data(i: Int)
> defined class Data
> scala> createParquetFile[Data]("testParquet")
> scala> parquetFile("testParquet").count()
> 14/09/15 14:34:17 WARN scheduler.DAGScheduler: Creating new stage failed due to exception - job: 0
> java.lang.NullPointerException
> 	at org.apache.spark.sql.parquet.FilteringParquetRowInputFormat.getSplits(ParquetTableOperations.scala:438)
> 	at parquet.hadoop.ParquetInputFormat.getSplits(ParquetInputFormat.java:344)
> 	at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:95)
> 	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
> 	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
> {code}


