Posted to user@spark.apache.org by anbu <an...@gmail.com> on 2018/04/04 14:58:22 UTC

ClassCastException: java.sql.Date cannot be cast to java.lang.String in Scala

Could someone please help me fix the error below in Spark 2.1.0 /
Scala 2.11.8? Basically I'm migrating code from Spark 1.6.0 to
Spark 2.1.0.

I'm getting the following exception in Spark 2.1.0:

    Error: java.lang.ClassCastException: java.sql.Date cannot be cast to java.lang.String
        at org.apache.spark.sql.Row$class.getString(Row.scala)
        at org.apache.spark.sql.catalyst.expressions.GenericRow.getString(rows.scala)

Existing code in Spark 1.6.0 (trimmed to the relevant part):

    import java.text.SimpleDateFormat

    val someRDD = RDD.groupByKey().mapPartitions { iterator =>
      val dataFormatter = new SimpleDateFormat("yyyy-MM-dd")
      myList.map(ip => Row(
        ip._1.getInt(0),
        ip._1.getLong(1),
        ip._1.getInt(2),
        // the error points at the next two lines
        new java.sql.Date(dataFormatter.parse(ip._1.getString(7)).getTime),
        new java.sql.Date(dataFormatter.parse(ip._1.getString(8)).getTime)
      )).iterator
    }
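For what it's worth, the stack trace shows the exception is raised inside Row.getString itself, before SimpleDateFormat ever runs: the field value is already a java.sql.Date, and getString casts it to String. A minimal, Spark-free sketch of the same failure (the sample value is made up for illustration):

```scala
import java.sql.Date

object CastDemo extends App {
  // In Spark 2.x the date column is apparently materialized as java.sql.Date
  val field: Any = Date.valueOf("2018-04-04")

  // Row.getString(i) essentially does a cast to String, which throws
  // ClassCastException when the underlying value is a Date.
  val failed =
    try { field.asInstanceOf[String]; false }
    catch { case _: ClassCastException => true }

  println(failed)
}
```

This is why changing how the String is parsed afterwards makes no difference: the failing call is the getString lookup, not the parsing.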

I have also tried a different approach:

    import java.sql.Date

    Date.valueOf(ip._1.getString(7)).getTime,
    Date.valueOf(ip._1.getString(8)).getTime,

I'm still getting the same error:

    Caused by: java.lang.ClassCastException: java.sql.Date cannot be cast to java.lang.String
        at org.apache.spark.sql.Row$class.getString(Row.scala)
        at org.apache.spark.sql.catalyst.expressions.GenericRow.getString(rows.scala)
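If the column really does arrive as a DateType in Spark 2.x, then reading it with row.getDate(7) or row.getAs[java.sql.Date](7) instead of row.getString(7) should avoid the cast entirely. During migration, a defensive accessor that tolerates both representations may help; this is a Spark-free sketch (the helper name toSqlDate and the sample values are mine, for illustration):

```scala
import java.sql.Date
import java.text.SimpleDateFormat

object DateAccessDemo extends App {
  // Hypothetical helper: accept either representation, since Spark 1.6
  // handed the field over as a String while Spark 2.x hands it over as a Date.
  def toSqlDate(value: Any): Date = value match {
    case d: Date   => d                                  // already a Date: pass through
    case s: String =>                                    // a String: parse it
      new Date(new SimpleDateFormat("yyyy-MM-dd").parse(s).getTime)
    case other =>
      throw new IllegalArgumentException(s"unexpected field type: $other")
  }

  println(toSqlDate(Date.valueOf("2018-04-04")))
  println(toSqlDate("2018-04-04"))
}
```

In the mapPartitions body above, that would mean replacing new java.sql.Date(dataFormatter.parse(ip._1.getString(7)).getTime) with toSqlDate(ip._1.get(7)).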

Please help me with this error in Spark 2.1.0.

Could you also suggest a site or document covering Spark migration?



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org