Posted to issues@spark.apache.org by "Corey J. Nolet (JIRA)" <ji...@apache.org> on 2015/01/15 05:27:35 UTC

[jira] [Updated] (SPARK-5260) Expose JsonRDD.allKeysWithValueTypes() in a utility class

     [ https://issues.apache.org/jira/browse/SPARK-5260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Corey J. Nolet updated SPARK-5260:
----------------------------------
    Description: I have found this method extremely useful when implementing my own strategy for inferring a schema from parsed JSON. For now, I've actually copied the method right out of the JsonRDD class into my own project, but I think it would be immensely useful to keep the code in Spark and expose it publicly somewhere else, like an object called JsonSchema.  (was: I have found this method extremely useful when implementing my own method for inferring a schema from parsed JSON. For now, I've actually copied the method right out of the JsonRDD class into my own project, but I think it would be immensely useful to keep the code in Spark and expose it publicly somewhere else, like an object called JsonSchema.)
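
A minimal sketch of the kind of utility described above, assuming a parsed JSON record is available as a Map[String, Any]; the JsonSchema object name and the exact signature below are illustrative assumptions, not Spark's internal JsonRDD code:

    // Hypothetical sketch only: names and the signature are illustrative and
    // do not reproduce Spark's internal JsonRDD.allKeysWithValueTypes.
    object JsonSchema {

      // Recursively collect every key path in a parsed JSON record together
      // with the simple name of its value's type, e.g. ("address.zip", "Integer").
      def allKeysWithValueTypes(record: Map[String, Any],
                                prefix: String = ""): Set[(String, String)] =
        record.iterator.flatMap { case (key, value) =>
          val path = if (prefix.isEmpty) key else s"$prefix.$key"
          value match {
            case nested: Map[_, _] =>
              // Descend into nested objects, extending the key path.
              allKeysWithValueTypes(nested.asInstanceOf[Map[String, Any]], path)
            case _: Seq[_] => Set(path -> "Array")
            case null      => Set(path -> "Null")
            case other     => Set(path -> other.getClass.getSimpleName)
          }
        }.toSet
    }

For example, Map("name" -> "Corey", "address" -> Map("zip" -> 12345)) would yield Set(("name", "String"), ("address.zip", "Integer")); the method in JsonRDD itself works in terms of Spark SQL DataTypes rather than plain class-name strings.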

> Expose JsonRDD.allKeysWithValueTypes() in a utility class 
> ----------------------------------------------------------
>
>                 Key: SPARK-5260
>                 URL: https://issues.apache.org/jira/browse/SPARK-5260
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Corey J. Nolet
>             Fix For: 1.3.0
>
>
> I have found this method extremely useful when implementing my own strategy for inferring a schema from parsed JSON. For now, I've actually copied the method right out of the JsonRDD class into my own project, but I think it would be immensely useful to keep the code in Spark and expose it publicly somewhere else, like an object called JsonSchema.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org