Posted to issues@spark.apache.org by "Sun Rui (JIRA)" <ji...@apache.org> on 2015/10/14 06:32:05 UTC

[jira] [Commented] (SPARK-9302) Handle complex JSON types in collect()/head()

    [ https://issues.apache.org/jira/browse/SPARK-9302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14956237#comment-14956237 ] 

Sun Rui commented on SPARK-9302:
--------------------------------

This was fixed after support for complex types in DataFrame was added.
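
As a rough SparkR sketch of the behaviour after the fix (the file name events.json, the local-mode session setup, and the exact R representation of the extras column are illustrative assumptions, not part of the actual patch):

{noformat}
# Hedged sketch: assumes a Spark build where complex types are supported in
# SparkR collect()/head(); "events.json" is a hypothetical file containing
# records like the sample quoted in the issue description below.
library(SparkR)

sc <- sparkR.init(master = "local[*]")
sqlContext <- sparkRSQL.init(sc)

mydf <- jsonFile(sqlContext, "events.json")
printSchema(mydf)           # extras: array<struct<name:string,value:string>>

# With complex type support, head()/collect() no longer fail with the
# jobj coercion error; the array<struct> column is expected to come back
# as an R list column (the exact shape shown here is an assumption).
first <- head(mydf, 1)
first$extras[[1]]           # e.g. list(list(name = "videoSource", value = "mySource"), ...)

full <- collect(mydf)
sparkR.stop()
{noformat}

Before the fix, the same head() call failed with the coercion error quoted in the issue description below.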

> Handle complex JSON types in collect()/head()
> ---------------------------------------------
>
>                 Key: SPARK-9302
>                 URL: https://issues.apache.org/jira/browse/SPARK-9302
>             Project: Spark
>          Issue Type: Improvement
>          Components: SparkR
>    Affects Versions: 1.4.0, 1.4.1
>            Reporter: Sun Rui
>
> Reported in the mailing list by Exie <tf...@prodevelop.com.au>:
> {noformat}
> A sample record in raw JSON looks like this:
> {"version": 1,"event": "view","timestamp": 1427846422377,"system":
> "DCDS","asset": "6404476","assetType": "myType","assetCategory":
> "myCategory","extras": [{"name": "videoSource","value": "mySource"},{"name":
> "playerType","value": "Article"},{"name": "duration","value":
> "202088"}],"trackingId": "155629a0-d802-11e4-13ee-6884e43d6000","ipAddress":
> "165.69.2.4","title": "myTitle"}
> > head(mydf)
> Error in as.data.frame.default(x[[i]], optional = TRUE) : 
>   cannot coerce class ""jobj"" to a data.frame
> >
> > show(mydf)
> DataFrame[localEventDtTm:timestamp, asset:string, assetCategory:string, assetType:string, event:string, extras:array<struct<name:string,value:string>>, ipAddress:string, memberId:string, system:string, timestamp:bigint, title:string, trackingId:string, version:bigint]
> >
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org