Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2016/11/08 07:11:58 UTC
[jira] [Closed] (SPARK-10840) SparkSQL doesn't work well with JSON
[ https://issues.apache.org/jira/browse/SPARK-10840?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin closed SPARK-10840.
-------------------------------
Resolution: Duplicate
> SparkSQL doesn't work well with JSON
> ------------------------------------
>
> Key: SPARK-10840
> URL: https://issues.apache.org/jira/browse/SPARK-10840
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Reporter: Jordan Sarraf
> Priority: Minor
> Labels: JSON, Scala, SparkSQL
>
> Well-formed JSON doesn't work with version 1.5.1 when using sqlContext.read.json("<json-file>"):
> {
>   "employees": {
>     "employee": [
>       {
>         "name": "Mia",
>         "surname": "Radison",
>         "mobile": "7295913821",
>         "email": "miaradison@sparky.com"
>       },
>       {
>         "name": "Thor",
>         "surname": "Kovaskz",
>         "mobile": "8829177193",
>         "email": "tkovaskz@sparky.com"
>       },
>       {
>         "name": "Bindy",
>         "surname": "Kvuls",
>         "mobile": "5033828845",
>         "email": "bindykk@sparky.com"
>       }
>     ]
>   }
> }
> For the above, the following error is obtained:
> ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 2)
> scala.MatchError: (VALUE_STRING,StructType()) (of class scala.Tuple2)
> Whereas this works fine, because each record is on a single line:
> [
> {"name": "Mia","surname": "Radison","mobile": "7295913821","email": "miaradison@sparky.com"},
> {"name": "Thor","surname": "Kovaskz","mobile": "8829177193","email": "tkovaskz@sparky.com"},
> {"name": "Bindy","surname": "Kvuls","mobile": "5033828845","email": "bindykk@sparky.com"}
> ]
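> The underlying cause is that Spark 1.x's JSON datasource reads its input as newline-delimited JSON: each line of the file must hold one complete JSON value, so a pretty-printed document split across lines fails to parse. One workaround is to flatten the document into one-record-per-line form before handing it to the reader. A minimal sketch in Python (the helper name to_json_lines and the hard-coded "employees"/"employee" keys mirror the example above and are illustrative, not part of Spark):
>
> ```python
> import json
>
> def to_json_lines(pretty_json: str) -> str:
>     """Convert the pretty-printed employees document above into
>     one-JSON-object-per-line form that Spark 1.x's json reader accepts."""
>     doc = json.loads(pretty_json)
>     records = doc["employees"]["employee"]
>     # Compact separators keep each record on a single line.
>     return "\n".join(json.dumps(r, separators=(",", ":")) for r in records)
> ```
>
> The resulting text can then be written back out and loaded with sqlContext.read.json as in the second, working example.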
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org