Posted to user@flink.apache.org by Stefano Baghino <st...@radicalbit.io> on 2016/06/23 14:22:56 UTC

Scala/ReactiveMongo: type classes, macros and java.io.Serializable

Hello everybody,

over the past few days I've been writing batch input/output formats for MongoDB.

Initially, I tried to use the non-blocking ReactiveMongo
<http://reactivemongo.org/> driver, which uses the type class pattern in
Scala for its serialization logic. The library also exposes some pretty
neat macros that automatically generate the type class instances for you,
given a case class.

Now, the problem is that these macros are not usable in Flink because the
generated type class instances would have to be serializable (something
that has understandably been left out of the macros). Has anyone ever
faced a similar problem? I've encountered it again when using upickle
<http://www.lihaoyi.com/upickle-pprint/upickle/>, which has a similar
facility, but for JSON serialization.

In the end I resorted to writing my own serialization logic and
explicitly extending java.io.Serializable, but I feel there may be a way
to avoid this (without rewriting/extending the macros to make the
generated classes serializable).
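
To make the issue concrete, here is a minimal, self-contained sketch; the
`Writer` type class and `userWriter` instance are hypothetical stand-ins
for what the ReactiveMongo/upickle macros generate, not their actual API:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

object SerializabilityDemo extends App {
  // Hypothetical stand-in for a macro-generated type class instance;
  // like the generated handlers, it does not extend java.io.Serializable.
  trait Writer[A] { def write(a: A): String }
  case class User(name: String)

  val userWriter: Writer[User] = new Writer[User] {
    def write(u: User): String = u.name
  }

  // Flink ships closures (and any captured type class instances) to the
  // workers via Java serialization, so the instance must survive this:
  def trySerialize(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
      true
    } catch {
      case _: NotSerializableException => false
    }

  // The macro-style instance cannot be shipped.
  println(trySerialize(userWriter)) // prints "false"
}
```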

-- 
BR,
Stefano Baghino

Software Engineer @ Radicalbit

Re: Scala/ReactiveMongo: type classes, macros and java.io.Serializable

Posted by Aljoscha Krettek <al...@apache.org>.
Hi,
could you maybe write TypeInformation/TypeSerializer wrappers that lazily
instantiate a type class-based serializer? It might even work using a "lazy
val". Something like this:

class ScalaTypeSerializer[T] extends TypeSerializer[T] {
  // lazily created, so the instance is built on the worker after
  // deserialization instead of being shipped over the wire
  lazy val serializer = "create the scala serializer"
  ...

  def serialize(value: T, out: DataOutputView): Unit = {
    // not sure how the generated serializers would be used, just a placeholder
    serializer.serialize(value, out)
  }
}
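
As a sanity check of the lazy-val idea, here is a runnable sketch outside of
Flink. Note that in plain Scala the usual idiom is `@transient lazy val`: the
field is skipped by Java serialization and re-created on first access after
deserialization. `ScalaSideSerializer` and `WrapperSerializer` are hypothetical
names for illustration, not Flink API:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

object LazyValDemo extends App {
  // Stand-in for a macro-generated, non-Serializable type class instance.
  class ScalaSideSerializer {
    def serialize(value: String): Array[Byte] = value.getBytes("UTF-8")
  }

  // The wrapper itself is Serializable; the @transient lazy val keeps the
  // non-serializable instance out of the serialized form and re-creates
  // it lazily on the worker after deserialization.
  class WrapperSerializer extends Serializable {
    @transient lazy val serializer: ScalaSideSerializer = new ScalaSideSerializer
    def serialize(value: String): Array[Byte] = serializer.serialize(value)
  }

  // Round-trip through Java serialization, as Flink does when shipping
  // serializers to the workers.
  val bytes = new ByteArrayOutputStream()
  new ObjectOutputStream(bytes).writeObject(new WrapperSerializer)
  val restored = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray))
    .readObject().asInstanceOf[WrapperSerializer]

  // The lazy field is rebuilt on first use after deserialization.
  println(new String(restored.serialize("hello"), "UTF-8")) // prints "hello"
}
```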

Cheers,
Aljoscha
