Posted to user@spark.apache.org by Corentin Kerisit <co...@gmail.com> on 2016/09/01 15:41:56 UTC

Error creating dataframe from schema with nested using case class

Hi all,

After migrating to Spark 2.0.0, one of my programs now throws the following
runtime exception:

- java.lang.RuntimeException: conversions.ProtoTCConversion$Timestamp is
not a valid external type for schema of struct<seconds:bigint,nanos:int>

although Timestamp is defined as follows:

- case class GoogleTimestamp(seconds: Long, nanos: Int)

I create the dataframe using createDataFrame(rdd, schema), where the schema
contains, among other fields, the following struct:

StructType(List(
    StructField("seconds", LongType, false),
    StructField("nanos", IntegerType, false)
  ))
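For reference, here is a simplified sketch of the setup (a minimal repro I put
together; the object name, field name "ts", and literal values are mine, not the
actual program). Wrapping the nested fields in a Row instead of the case class
seems to avoid the error, which makes me suspect that in 2.0 the Row-based
createDataFrame no longer accepts case classes as struct values:

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types._

object Repro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("nested-struct-repro")
      .getOrCreate()

    // Schema with a nested struct, as in the original program.
    val schema = StructType(List(
      StructField("ts", StructType(List(
        StructField("seconds", LongType, false),
        StructField("nanos", IntegerType, false)
      )), false)
    ))

    // Passing GoogleTimestamp(...) here as the struct value is what triggers
    // the "not a valid external type" error on 2.0.0; a nested Row with the
    // same fields matches the schema instead.
    val rdd = spark.sparkContext.parallelize(Seq(
      Row(Row(1472744516L, 0))
    ))

    val df = spark.createDataFrame(rdd, schema)
    df.show(false)
    spark.stop()
  }
}
```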

This code worked fine on 1.6.2 but fails on 2.0.0, and I cannot see what is
going wrong.

Help much appreciated.

*Corentin Kerisit*
VSRE compliant