Posted to user@spark.apache.org by V0lleyBallJunki3 <ve...@gmail.com> on 2018/08/16 01:59:52 UTC

java.lang.UnsupportedOperationException: No Encoder found for Set[String]

Hello,
  I am using Spark 2.2.2 with Scala 2.11.8. I wrote a short program:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[4]").getOrCreate()

case class TestCC(i: Int, ss: Set[String])

// spark.sqlContext.implicits._ exposes the same conversions, so one import is enough
import spark.implicits._

val testCCDS = Seq(TestCC(1, Set("SS", "Salil")), TestCC(2, Set("xx", "XYZ"))).toDS()


I get:
java.lang.UnsupportedOperationException: No Encoder found for Set[String]
- field (class: "scala.collection.immutable.Set", name: "ss")
- root class: "TestCC"
  at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor$1.apply(ScalaReflection.scala:632)
  at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor$1.apply(ScalaReflection.scala:455)
  at scala.reflect.internal.tpe.TypeConstraints$UndoLog.undo(TypeConstraints.scala:56)
  at org.apache.spark.sql.catalyst.ScalaReflection$class.cleanUpReflectionObjects(ScalaReflection.scala:809)
  at org.apache.spark.sql.catalyst.ScalaReflection$.cleanUpReflectionObjects(ScalaReflection.scala:39)
  at org.apache.spark.sql.catalyst.ScalaReflection$.org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor(ScalaReflection.scala:455)
  at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor$1$$anonfun$10.apply(ScalaReflection.scala:626)
  at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor$1$$anonfun$10.apply(ScalaReflection.scala:614)
  at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)

To the best of my knowledge, implicit support for Set was added in Spark
2.2. Am I missing something?





Re: java.lang.UnsupportedOperationException: No Encoder found for Set[String]

Posted by Manu Zhang <ow...@gmail.com>.
You may try applying this PR: https://github.com/apache/spark/pull/18416.
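
If patching and rebuilding Spark is not an option, another way to stay on
2.2 is to keep the field as a Seq inside the Dataset (Seq[String] fields
already have encoder support there) and convert back to a Set outside the
Dataset. A minimal sketch, assuming the TestCC example from the original
post; TestCCSeq is a hypothetical variant introduced only for illustration:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[4]").getOrCreate()
import spark.implicits._

// Hypothetical variant of TestCC: Seq[String] has an implicit encoder in
// Spark 2.2, while Set[String] does not.
case class TestCCSeq(i: Int, ss: Seq[String])

val testCCDS = Seq(
  TestCCSeq(1, Set("SS", "Salil").toSeq),
  TestCCSeq(2, Set("xx", "XYZ").toSeq)
).toDS()

// Convert back to Sets on the driver; collect() returns a plain Array, so
// no encoder is involved at this point.
val withSets = testCCDS.collect().map(t => (t.i, t.ss.toSet))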

On Fri, Aug 17, 2018 at 9:13 AM Venkat Dabri <ve...@gmail.com> wrote:

> We are using Spark 2.2.0. Is it possible to bring ExpressionEncoder and
> the related classes from 2.3.0 into my code base and use them? The
> changes in ExpressionEncoder between 2.3.0 and 2.2.0 are small, but there
> might be many other classes underneath that have changed.

Re: java.lang.UnsupportedOperationException: No Encoder found for Set[String]

Posted by Venkat Dabri <ve...@gmail.com>.
We are using Spark 2.2.0. Is it possible to bring ExpressionEncoder and
the related classes from 2.3.0 into my code base and use them? The
changes in ExpressionEncoder between 2.3.0 and 2.2.0 are small, but there
might be many other classes underneath that have changed.
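
Backporting ExpressionEncoder and everything it depends on is likely to be
fragile. An alternative that stays on stock 2.2 is to bypass the
reflection-based encoder for this class and pass a Kryo-backed encoder
explicitly. A rough sketch, assuming the TestCC case class from the
original post; note that a Kryo encoder stores the whole object in a
single binary column, so individual fields are not queryable with SQL:

import org.apache.spark.sql.{Encoders, SparkSession}

val spark = SparkSession.builder().master("local[4]").getOrCreate()

case class TestCC(i: Int, ss: Set[String])

val data = Seq(TestCC(1, Set("SS", "Salil")), TestCC(2, Set("xx", "XYZ")))

// Pass the encoder explicitly instead of relying on spark.implicits._, so
// it cannot clash with the derived case-class encoder.
val testCCDS = spark.createDataset(data)(Encoders.kryo[TestCC])

testCCDS.collect().foreach(println)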

On Thu, Aug 16, 2018 at 5:23 AM, Manu Zhang <ow...@gmail.com> wrote:
> Hi,
>
> It was added in Spark 2.3.0:
> https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/SQLImplicits.scala#L180
>
> Regards,
> Manu Zhang
>


Re: java.lang.UnsupportedOperationException: No Encoder found for Set[String]

Posted by Manu Zhang <ow...@gmail.com>.
Hi,

It was added in Spark 2.3.0:
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/SQLImplicits.scala#L180
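
For reference, on 2.3.0 or later the snippet from the original post should
work unchanged, since spark.implicits._ then supplies encoder support for
Set fields. A quick sketch; the printed schema shown in the comment is an
assumption about how the Set column is mapped:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[4]").getOrCreate()
import spark.implicits._

case class TestCC(i: Int, ss: Set[String])

// Compiles and runs on Spark 2.3.0+ thanks to the added Set encoder.
val testCCDS = Seq(TestCC(1, Set("SS", "Salil")), TestCC(2, Set("xx", "XYZ"))).toDS()

// Presumably something like:
//  |-- i: integer (nullable = false)
//  |-- ss: array (nullable = true)
//  |    |-- element: string (containsNull = true)
testCCDS.printSchema()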

Regards,
Manu Zhang
