Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2014/09/04 02:50:51 UTC

[jira] [Updated] (SPARK-3390) sqlContext.jsonRDD fails on a complex structure of array and hashmap nesting

     [ https://issues.apache.org/jira/browse/SPARK-3390?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust updated SPARK-3390:
------------------------------------
    Target Version/s: 1.2.0

> sqlContext.jsonRDD fails on a complex structure of array and hashmap nesting
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-3390
>                 URL: https://issues.apache.org/jira/browse/SPARK-3390
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.0.2
>            Reporter: Vida Ha
>            Assignee: Yin Huai
>            Priority: Critical
>
> I found a valid JSON string that Spark SQL fails to parse correctly.
> Try running these lines in a spark-shell:
> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
> val badJson = "{\"foo\": [[{\"bar\": 0}]]}"
> val rdd = sc.parallelize(badJson :: Nil)
> sqlContext.jsonRDD(rdd).count()
> I've tried running these lines on the 1.0.2 release as well as the latest Spark 1.1 release candidate, and I get this stack trace:
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 2.0:3 failed 1 times, most recent failure: Exception failure in TID 7 on host localhost: scala.MatchError: StructType(List()) (of class org.apache.spark.sql.catalyst.types.StructType)
>         org.apache.spark.sql.json.JsonRDD$.enforceCorrectType(JsonRDD.scala:333)
>         org.apache.spark.sql.json.JsonRDD$$anonfun$enforceCorrectType$1.apply(JsonRDD.scala:335)
>         scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>         scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>         scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>         scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>         scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>         scala.collection.AbstractTraversable.map(Traversable.scala:105)
>         org.apache.spark.sql.json.JsonRDD$.enforceCorrectType(JsonRDD.scala:335)
>         org.apache.spark.sql.json.JsonRDD$$anonfun$enforceCorrectType$1.apply(JsonRDD.scala:335)
>         scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>         scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>         scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>         scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>         scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>         scala.collection.AbstractTraversable.map(Traversable.scala:105)
>         org.apache.spark.sql.json.JsonRDD$.enforceCorrectType(JsonRDD.scala:335)
>         org.apache.spark.sql.json.JsonRDD$$anonfun$org$apache$spark$sql$json$JsonRDD$$asRow$1$$anonfun$apply$12.apply(JsonRDD.scala:365)
>         scala.Option.map(Option.scala:145)
>         org.apache.spark.sql.json.JsonRDD$$anonfun$org$apache$spark$sql$json$JsonRDD$$asRow$1.apply(JsonRDD.scala:364)
>         org.apache.spark.sql.json.JsonRDD$$anonfun$org$apache$spark$sql$json$JsonRDD$$asRow$1.apply(JsonRDD.scala:349)
>         scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>         scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>         org.apache.spark.sql.json.JsonRDD$.org$apache$spark$sql$json$JsonRDD$$asRow(JsonRDD.scala:349)
>         org.apache.spark.sql.json.JsonRDD$$anonfun$createLogicalPlan$1.apply(JsonRDD.scala:51)
>         org.apache.spark.sql.json.JsonRDD$$anonfun$createLogicalPlan$1.apply(JsonRDD.scala:51)
>         scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
>         scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> ....
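For reference (an editor's sanity check, not part of the original report): the string is indeed valid JSON, and a standalone parser accepts it, so the failure is specific to Spark SQL's handling of the doubly nested array (an array of arrays of objects) rather than to the input itself. A minimal check with Python's standard-library `json` module:

```python
import json

# Same payload as in the repro: an object whose "foo" field is
# an array of arrays of objects -- the nesting that trips jsonRDD.
bad_json = '{"foo": [[{"bar": 0}]]}'

parsed = json.loads(bad_json)  # parses without error: the string is valid JSON
print(parsed["foo"][0][0]["bar"])  # -> 0
```

Since a generic parser handles the string, the `scala.MatchError: StructType(List())` above points at schema inference or type enforcement inside `JsonRDD.enforceCorrectType`, not at malformed input.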



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
