Posted to user@spark.apache.org by gbop <li...@gmail.com> on 2015/10/19 20:15:53 UTC

new 1.5.1 behavior - exception on executor throws ClassNotFound on driver

I've been struggling with a particularly puzzling issue after upgrading to
Spark 1.5.1 from Spark 1.4.1.

When I use the MySQL JDBC connector and an exception (e.g.
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException) is thrown on the
executor, I get a ClassNotFoundException on the driver, which results in
this error (logs are abbreviated):



In Spark 1.4.1, I get the following (logs are abbreviated):



Either I have seriously screwed up somewhere, or this is a change in behavior
that I have not been able to find in the documentation. For those who are
interested, a full repro and logs follow.


---

I am running this on Spark 1.5.1 + Hadoop 2.6. I have tried this in various
combinations of:
 * local and standalone mode
 * putting mysql on the classpath with --jars, building a fat jar with mysql
   in it, and manually running sc.addJar on the mysql jar
 * --deploy-mode client and --deploy-mode cluster
but nothing seems to change.
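
For illustration, the --jars and sc.addJar variants were along these lines
(the paths and main class below are placeholders, not the exact ones from the
attached invocation):

// Passing the connector on the command line (placeholder paths):
//   spark-submit --jars /path/to/mysql-connector-java-5.1.34.jar \
//     --class <MainClass> my-app.jar
// Or registering it programmatically right after creating the SparkContext:
sc.addJar("/path/to/mysql-connector-java-5.1.34.jar")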



Here is an example invocation, and the accompanying source code:




The source code:



And the build.sbt:




And here are the results when run against Spark 1.4.1 (build.sbt has been
updated accordingly):





--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/new-1-5-1-behavior-exception-on-executor-throws-ClassNotFound-on-driver-tp25124.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: new 1.5.1 behavior - exception on executor throws ClassNotFound on driver

Posted by Lij Tapel <li...@gmail.com>.
Wow, thanks, that's a great place to start digging deeper. Would it be
appropriate to file this on JIRA? It makes Spark 1.5.1 a bit of a deal-breaker
for me, but I wouldn't mind taking a shot at fixing it given some guidance.


Re: new 1.5.1 behavior - exception on executor throws ClassNotFound on driver

Posted by Ted Yu <yu...@gmail.com>.
Interesting.
I wonder why existing tests didn't catch that:

class UnserializableException extends Exception {
    ./core/src/test/scala/org/apache/spark/rpc/RpcEnvSuite.scala
class DAGSchedulerSuiteDummyException extends Exception
    ./core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala
class TestException(msg: String) extends Exception(msg)
    ./streaming/src/test/scala/org/apache/spark/streaming/StreamingContextSuite.scala



Re: new 1.5.1 behavior - exception on executor throws ClassNotFound on driver

Posted by Lij Tapel <li...@gmail.com>.
I have verified that there is only 5.1.34 on the classpath.

Funnily enough, I have a repro that doesn't even use mysql, so this seems to
be purely a classloader issue:

source: http://pastebin.com/WMCMwM6T
1.4.1: http://pastebin.com/x38DQY2p
1.5.1: http://pastebin.com/DQd6k818
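
Roughly, the repro boils down to something like this (the class and object
names here are placeholders, not necessarily what is in the paste): a custom
exception that only exists in the application jar gets thrown inside a task,
and the driver then fails while deserializing the task failure.

import org.apache.spark.{SparkConf, SparkContext}

// Placeholder name -- any exception class that lives only in the app jar.
class MyCustomException(msg: String) extends Exception(msg)

object ClassNotFoundRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("cnf-repro"))
    try {
      sc.parallelize(1 to 10).map { i =>
        if (i % 2 == 0) throw new MyCustomException(s"boom on $i")
        i
      }.collect()
    } catch {
      // On 1.4.1 the original exception shows up in the job failure; on 1.5.1
      // the driver instead reports a ClassNotFoundException for it.
      case e: Throwable => e.printStackTrace()
    } finally {
      sc.stop()
    }
  }
}

Built into a jar and run with spark-submit, this kind of program is what
behaves differently for me between 1.4.1 and 1.5.1.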




Re: new 1.5.1 behavior - exception on executor throws ClassNotFound on driver

Posted by Ted Yu <yu...@gmail.com>.
Lij:

jar tvf /Users/tyu/.m2/repository//mysql/mysql-connector-java/5.1.31/mysql-connector-java-5.1.31.jar | grep MySQLSyntaxErrorExceptio
   914 Wed May 21 01:42:16 PDT 2014  com/mysql/jdbc/exceptions/MySQLSyntaxErrorException.class
   842 Wed May 21 01:42:18 PDT 2014  com/mysql/jdbc/exceptions/jdbc4/MySQLSyntaxErrorException.class

5.1.34 has basically the same structure.

Can you check if there is another version of mysql-connector-java on the
classpath?
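
One quick way to check, if it helps -- this is just a generic JVM trick (the
sketch below assumes the connector class is on your compile classpath and
that sc is your SparkContext, e.g. in spark-shell) -- is to print where the
class actually gets loaded from, on the driver and inside a task:

// Where does the driver load it from?
println(classOf[com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException]
  .getProtectionDomain.getCodeSource.getLocation)

// And where do the executors load it from?
sc.parallelize(1 to 1).map { _ =>
  Class.forName("com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException")
    .getProtectionDomain.getCodeSource.getLocation.toString
}.collect().foreach(println)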

Thanks


Re: new 1.5.1 behavior - exception on executor throws ClassNotFound on driver

Posted by Lij Tapel <li...@gmail.com>.
Sorry, here's the logs and source:

The error I see in Spark 1.5.1: http://pastebin.com/86K9WQ5f
* full logs here: http://pastebin.com/dfysSh9E

What I used to see in Spark 1.4.1: http://pastebin.com/eK3AZQFx
* full logs here: http://pastebin.com/iffSFFWW

The source and build.sbt: http://pastebin.com/tUvcBerd


Re: new 1.5.1 behavior - exception on executor throws ClassNotFound on driver

Posted by Ted Yu <yu...@gmail.com>.
The attachments didn't go through.

Consider pastebin'ning.

Thanks
