Posted to user@spark.apache.org by abhishek <re...@gmail.com> on 2014/12/08 08:36:11 UTC

Is there a way to get column names using hiveContext?

Hi,
 
I have iplRDD, which is JSON, and I run the steps below and query it through
hiveContext. I get the results, but without column headers. Is there a
way to get the column names?

val teamRDD = hiveContext.jsonRDD(iplRDD)
teamRDD.registerTempTable("teams")
hiveContext.cacheTable("teams")

val result = hiveContext.sql("select * from teams where team_name = 'KKR'")
result.collect.foreach(println)

Any thoughts, please?




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Is-there-a-way-to-get-column-names-using-hiveContext-tp20574.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Is there a way to get column names using hiveContext?

Posted by Michael Armbrust <mi...@databricks.com>.
You can call .schema on SchemaRDDs.  For example:

results.schema.fields.map(_.name)
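
A minimal sketch putting that together, reusing the names from the question
above (Spark 1.x SchemaRDD API; the tab-separated formatting is just one
option):

val result = hiveContext.sql("select * from teams where team_name = 'KKR'")

// The schema is inferred from the JSON input; take the field names from it.
val columnNames = result.schema.fields.map(_.name)

// Print a header row, then the data rows (Row behaves like a Seq here).
println(columnNames.mkString("\t"))
result.collect().foreach(row => println(row.mkString("\t")))

If you just want to inspect the column names and types, result.printSchema()
works too.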

On Sun, Dec 7, 2014 at 11:36 PM, abhishek <re...@gmail.com> wrote:

> Hi,
>
> I have iplRDD, which is JSON, and I run the steps below and query it through
> hiveContext. I get the results, but without column headers. Is there a
> way to get the column names?
>
> val teamRDD = hiveContext.jsonRDD(iplRDD)
> teamRDD.registerTempTable("teams")
> hiveContext.cacheTable("teams")
>
> val result = hiveContext.sql("select * from teams where team_name = 'KKR'")
> result.collect.foreach(println)
>
> Any thoughts, please?