Posted to user@spark.apache.org by Dave Maughan <da...@gmail.com> on 2016/06/06 12:13:33 UTC

Spark SQL - Encoders - case class

Hi,

I've figured out how to select data from a remote Hive instance and encode
the DataFrame -> Dataset using a Java POJO class:

    TestHive.sql("select foo_bar as `fooBar` from table1")
      .as(Encoders.bean(classOf[Table1]))
      .show()
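
For reference, a minimal Scala-side equivalent of such a POJO might look like the sketch below (hypothetical, reduced to the single aliased column; Encoders.bean needs a no-arg constructor and bean-style getters/setters, which @BeanProperty generates):

    import scala.beans.BeanProperty

    // Hypothetical bean class; assumes the query yields one string column `fooBar`.
    class Table1 {
      @BeanProperty var fooBar: String = _
    }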

However, I'm struggling to find out how to do the equivalent in Scala if Table1
is a case class. Could someone please point me in the right direction?

Thanks
- Dave

Re: Spark SQL - Encoders - case class

Posted by Dave Maughan <da...@gmail.com>.
Hi,

Thanks for the quick replies. I've tried those suggestions but Eclipse is
showing:

    Unable to find encoder for type stored in a Dataset. Primitive types
    (Int, String, etc) and Product types (case classes) are supported by
    importing sqlContext.implicits._ Support for serializing other types
    will be added in future.


Thanks

- Dave

Re: Spark SQL - Encoders - case class

Posted by Han JU <ju...@gmail.com>.
Hi,

I think encoders for case classes are already provided in Spark. You'll
just need to import them.

    import org.apache.spark.sql.SQLContext

    val sql = new SQLContext(sc)
    import sql.implicits._  // implicit Encoders for primitives and case classes

And then do the cast to a Dataset with .as[Table1].
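
Putting it together, something like this should work (a sketch: it assumes Table1 is a case class with a fooBar field, defined at the top level rather than inside a method, since nested definitions can defeat encoder derivation):

    // Top-level case class matching the query's aliased column
    case class Table1(fooBar: String)

    import TestHive.implicits._  // or sql.implicits._ as above

    TestHive.sql("select foo_bar as `fooBar` from table1")
      .as[Table1]  // Dataset[Table1] -- no explicit Encoder needed
      .show()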

2016-06-06 14:13 GMT+02:00 Dave Maughan <da...@gmail.com>:

> Hi,
>
> I've figured out how to select data from a remote Hive instance and encode
> the DataFrame -> Dataset using a Java POJO class:
>
>     TestHive.sql("select foo_bar as `fooBar` from table1")
>       .as(Encoders.bean(classOf[Table1]))
>       .show()
>
> However, I'm struggling to find out how to do the equivalent in Scala if
> Table1 is a case class. Could someone please point me in the right
> direction?
>
> Thanks
> - Dave
>



-- 
JU Han

Software Engineer @ Teads.tv

+33 0619608888