Posted to dev@spark.apache.org by jay vyas <ja...@gmail.com> on 2014/11/17 01:12:21 UTC

Is there a way for the Scala compiler to catch unserializable app code?

This is more a curiosity than an immediate problem.

Here is my question: I recently ran into this easily solved issue:
http://stackoverflow.com/questions/22592811/task-not-serializable-java-io-notserializableexception-when-calling-function-ou
The solution was to replace my "class" with a Scala singleton, which I guess
is readily serializable.

So it's clear that Spark needs to serialize the objects that carry an app's
driver methods in order to run... but I'm wondering: might there be a way to
change or update the Spark API to catch unserializable Spark apps at compile
time?
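
To make the pattern concrete, here is a minimal sketch (class and method
names are invented for illustration; the linked question has the original code):

    import org.apache.spark.{SparkConf, SparkContext}

    // Fails with "Task not serializable": lengthOf is an instance method,
    // so the map closure captures `this`, and WordLengths is not Serializable.
    class WordLengths {
      def lengthOf(s: String): Int = s.length

      def run(sc: SparkContext): Array[Int] =
        sc.parallelize(Seq("a", "bb", "ccc"))
          .map(lengthOf)   // drags the whole WordLengths instance into the closure
          .collect()
    }

    // The singleton fix: members of a top-level object are reached through a
    // static reference, so the closure captures no enclosing instance at all.
    object WordLengthsApp {
      def lengthOf(s: String): Int = s.length

      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("word-lengths").setMaster("local[*]"))
        try {
          println(sc.parallelize(Seq("a", "bb", "ccc"))
            .map(lengthOf)   // only the standalone function is shipped
            .collect()
            .mkString(", "))
        } finally {
          sc.stop()
        }
      }
    }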


-- 
jay vyas

Re: Is there a way for the Scala compiler to catch unserializable app code?

Posted by Andrew Ash <an...@andrewash.com>.
Hi Jay,

I just came across SPARK-720, "Statically guarantee serialization will succeed"
<https://issues.apache.org/jira/browse/SPARK-720>, which sounds like exactly
what you're referring to. Like Reynold, I think it's not possible at this
time, but it would be good to get your feedback on that ticket.
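
Until something like that exists, a workaround that catches the problem before
a job is submitted is to round-trip the closure through plain Java
serialization in a unit test. This is only a sketch, not anything SPARK-720
proposes or Spark provides, and the helper name is made up:

    import java.io.{ByteArrayOutputStream, ObjectOutputStream}

    object SerializabilityCheck {
      // Throws java.io.NotSerializableException if `closure` cannot be
      // serialized, which is what Spark's Java-based closure serializer
      // would hit at job submission time.
      def assertSerializable(closure: AnyRef): Unit = {
        val oos = new ObjectOutputStream(new ByteArrayOutputStream())
        try oos.writeObject(closure) finally oos.close()
      }
    }

    // e.g. in a test:
    // SerializabilityCheck.assertSerializable((s: String) => s.length)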

Andrew


On Sun, Nov 16, 2014 at 4:37 PM, Reynold Xin <rx...@databricks.com> wrote:

> That's a great idea, and it is also a pain point for some users. However, it
> is not possible to solve this problem at compile time, because what actually
> gets serialized can only be determined at runtime.
>
> There are some efforts in Scala to help users avoid mistakes like this. One
> example project, still on the research side, is Spores:
> http://docs.scala-lang.org/sips/pending/spores.html

Re: Is there a way for the Scala compiler to catch unserializable app code?

Posted by Reynold Xin <rx...@databricks.com>.
That's a great idea, and it is also a pain point for some users. However, it
is not possible to solve this problem at compile time, because what actually
gets serialized can only be determined at runtime.

There are some efforts in Scala to help users avoid mistakes like this. One
example project, still on the research side, is Spores:
http://docs.scala-lang.org/sips/pending/spores.html
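
For anyone curious, a rough sketch of what the spore syntax from that SIP looks
like (this assumes the experimental spores library is on the classpath, so
treat it as illustrative rather than something Spark supports today):

    import scala.spores._

    object SporeExample {
      case class Helper(value: Int)
      val helper = Helper(41)

      // A spore must declare everything it captures in its header; capturing
      // anything else (e.g. an enclosing `this`) is rejected at compile time.
      val addValue = spore {
        val h = helper            // explicit, checked capture
        (x: Int) => x + h.value
      }
    }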


