Posted to user@spark.apache.org by Jesse F Chen <jf...@us.ibm.com> on 2016/05/09 20:24:46 UTC

spark 2.0 issue with yarn?


I had been running fine until builds around 05/07/2016....

If I used "--master yarn" in builds after 05/07, I got the following
error... it sounds like some jars are missing.

I am using YARN 2.7.2 and Hive 1.2.1.

Do I need something new to deploy related to YARN?

bin/spark-sql -driver-memory 10g --verbose --master yarn --packages
com.databricks:spark-csv_2.10:1.3.0 --executor-memory 4g --num-executors
20 --executor-cores 2

16/05/09 13:15:21 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/05/09 13:15:21 INFO server.AbstractConnector: Started
SelectChannelConnector@0.0.0.0:4041
16/05/09 13:15:21 INFO util.Utils: Successfully started service 'SparkUI'
on port 4041.
16/05/09 13:15:21 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at
http://bigaperf116.svl.ibm.com:4041
Exception in thread "main" java.lang.NoClassDefFoundError:
com/sun/jersey/api/client/config/ClientConfig
	at
org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient
(TimelineClient.java:45)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit
(YarnClientImpl.java:163)
	at org.apache.hadoop.service.AbstractService.init
(AbstractService.java:163)
	at org.apache.spark.deploy.yarn.Client.submitApplication
(Client.scala:150)
	at
org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start
(YarnClientSchedulerBackend.scala:56)
	at org.apache.spark.scheduler.TaskSchedulerImpl.start
(TaskSchedulerImpl.scala:148)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:502)
	at org.apache.spark.SparkContext$.getOrCreate
(SparkContext.scala:2246)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate
(SparkSession.scala:762)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init
(SparkSQLEnv.scala:57)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>
(SparkSQLCLIDriver.scala:281)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main
(SparkSQLCLIDriver.scala:138)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main
(SparkSQLCLIDriver.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke
(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke
(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy
$SparkSubmit$$runMain(SparkSubmit.scala:727)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1
(SparkSubmit.scala:183)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException:
com.sun.jersey.api.client.config.ClientConfig
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 22 more
16/05/09 13:15:21 INFO storage.DiskBlockManager: Shutdown hook called
16/05/09 13:15:21 INFO util.ShutdownHookManager: Shutdown hook called
16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting
directory /tmp/spark-ac33b501-b9c3-47a3-93c8-fa02720bf4bb
16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting
directory /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c
16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting
directory /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c/userFiles-46dde536-29e5-46b3-a530-e5ad6640f8b2




                                                                                                                                              
                                                                                                                                              
                                                                                                                                              
                                                                                                                                              
                                                                                                                                              
                                                                                                                                              
JESSE CHEN
Big Data Performance | IBM Analytics

Office: 408 463 2296
Mobile: 408 828 9068
Email: jfchen@us.ibm.com
                                                                                                                                              
                                                                                                                                              



Re: spark 2.0 issue with yarn?

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Mon, May 9, 2016 at 3:34 PM, Matt Cheah <mc...@palantir.com> wrote:
> @Marcelo: Interesting - why would this manifest on the YARN-client side
> though (as Spark is the client to YARN in this case)? Spark as a client
> shouldn’t care about what auxiliary services are on the YARN cluster.

The ATS client is based on jersey 1.

-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org



Re: spark 2.0 issue with yarn?

Posted by Matt Cheah <mc...@palantir.com>.
@Marcelo: Interesting - why would this manifest on the YARN-client side
though (as Spark is the client to YARN in this case)? Spark as a client
shouldn’t care about what auxiliary services are on the YARN cluster.

@Jesse: The change I wrote excludes all artifacts from the com.sun.jersey
group. So to allow use of the timeline service, we need to identify the one
artifact we actually depend on and exclude everything else from the YARN
dependencies.

I think the class that’s missing in this particular stack trace comes from
com.sun.jersey:jersey-client. Refactoring the exclusions in pom.xml under
hadoop-yarn-api to not exclude jersey-client should fix at least this.
Again – we want to exclude all the other Jersey things that YARN is
pulling in though, unless further errors indicate otherwise.
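A rough sketch of what that refactor might look like under the
hadoop-yarn-api dependency in pom.xml (the artifact names below are
illustrative, not Spark's actual exclusion list; the point is just to
exclude individual com.sun.jersey artifacts rather than the whole group, so
that jersey-client keeps coming in transitively):

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-yarn-api</artifactId>
  <version>${yarn.version}</version>
  <exclusions>
    <!-- Illustrative: exclude Jersey 1 pieces one by one instead of the
         whole com.sun.jersey group, leaving jersey-client (which provides
         com.sun.jersey.api.client.config.ClientConfig) available. -->
    <exclusion>
      <groupId>com.sun.jersey</groupId>
      <artifactId>jersey-server</artifactId>
    </exclusion>
    <exclusion>
      <groupId>com.sun.jersey</groupId>
      <artifactId>jersey-json</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```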

In general I’m wary of having both Jersey 1 and Jersey 2 jars on the class
path at all. However jersey-client looks relatively harmless since it does
not bundle in JAX-RS classes, nor does it appear to have anything weird in
its META-INF folder.

-Matt Cheah



On 5/9/16, 3:10 PM, "Marcelo Vanzin" <va...@cloudera.com> wrote:

>Hi Jesse,
>
>On Mon, May 9, 2016 at 2:52 PM, Jesse F Chen <jf...@us.ibm.com> wrote:
>> Sean - thanks. definitely related to SPARK-12154.
>> Is there a way to continue use Jersey 1 for existing working
>>environment?
>
>The error you're getting is because of a third-party extension that
>tries to talk to the YARN ATS; that's not part of upstream Spark,
>although I believe it's part of HDP. So you may have to talk to
>Hortonworks about that, or disable that extension in Spark 2.0 for the
>moment.
>
>
>-- 
>Marcelo





Re: spark 2.0 issue with yarn?

Posted by Marcelo Vanzin <va...@cloudera.com>.
Hi Jesse,

On Mon, May 9, 2016 at 2:52 PM, Jesse F Chen <jf...@us.ibm.com> wrote:
> Sean - thanks. definitely related to SPARK-12154.
> Is there a way to continue use Jersey 1 for existing working environment?

The error you're getting is because of a third-party extension that
tries to talk to the YARN ATS; that's not part of upstream Spark,
although I believe it's part of HDP. So you may have to talk to
Hortonworks about that, or disable that extension in Spark 2.0 for the
moment.
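
If disabling is the route you take, one possible knob (an assumption on my
part, since the extension itself is not part of upstream Spark; check your
distribution's docs) is the standard YARN timeline-service switch, passed
through Spark's Hadoop-configuration prefix:

```
# spark-defaults.conf, or --conf on the spark-sql/spark-submit command line
spark.hadoop.yarn.timeline-service.enabled  false
```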


-- 
Marcelo




Re: spark 2.0 issue with yarn?

Posted by Jesse F Chen <jf...@us.ibm.com>.
Sean - thanks, definitely related to SPARK-12154.

Is there a way to continue using Jersey 1 in the existing working environment?

Or, what's the best way to migrate an existing Jersey 1 environment to
Jersey 2?

This does break all of our Spark jobs running Spark 2.0 on YARN.

                                                                                                                                              
                                                                                                                                              
                                                                                                                                              
                                                                                                                                              
                                                                                                                                              
                                                                                                                                              
JESSE CHEN
Big Data Performance | IBM Analytics

Office: 408 463 2296
Mobile: 408 828 9068
Email: jfchen@us.ibm.com
                                                                                                                                              
                                                                                                                                              






From:	Sean Owen <so...@cloudera.com>
To:	Jesse F Chen/San Francisco/IBM@IBMUS
Cc:	spark users <us...@spark.apache.org>, dev
            <de...@spark.apache.org>, Roy Cecil <ro...@ie.ibm.com>, Matt
            Cheah <mc...@palantir.com>
Date:	05/09/2016 02:19 PM
Subject:	Re: spark 2.0 issue with yarn?



Hm, this may be related to updating to Jersey 2, which happened 4 days
ago: https://issues.apache.org/jira/browse/SPARK-12154

That is a Jersey 1 class that's missing. How are you building and running
Spark?

I think the theory was that Jersey 1 would still be supplied at runtime. We
may have to revise the exclusions.

On Mon, May 9, 2016 at 9:24 PM, Jesse F Chen <jf...@us.ibm.com> wrote:
  I had been running fine until builds around 05/07/2016....

  If I used the "--master yarn" in builds after 05/07, I got the following
  error...sounds like something jars are missing.

  I am using YARN 2.7.2 and Hive 1.2.1.

  Do I need something new to deploy related to YARN?

  bin/spark-sql -driver-memory 10g --verbose --master yarn --packages
  com.databricks:spark-csv_2.10:1.3.0 --executor-memory 4g --num-executors
  20 --executor-cores 2

  16/05/09 13:15:21 INFO server.Server: jetty-8.y.z-SNAPSHOT
  16/05/09 13:15:21 INFO server.AbstractConnector: Started
  SelectChannelConnector@0.0.0.0:4041
  16/05/09 13:15:21 INFO util.Utils: Successfully started service 'SparkUI'
  on port 4041.
  16/05/09 13:15:21 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started
  at http://bigaperf116.svl.ibm.com:4041
  Exception in thread "main" java.lang.NoClassDefFoundError:
  com/sun/jersey/api/client/config/ClientConfig
  at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient
  (TimelineClient.java:45)
  at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit
  (YarnClientImpl.java:163)
  at org.apache.hadoop.service.AbstractService.init
  (AbstractService.java:163)
  at org.apache.spark.deploy.yarn.Client.submitApplication
  (Client.scala:150)
  at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start
  (YarnClientSchedulerBackend.scala:56)
  at org.apache.spark.scheduler.TaskSchedulerImpl.start
  (TaskSchedulerImpl.scala:148)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:502)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2246)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate
  (SparkSession.scala:762)
  at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init
  (SparkSQLEnv.scala:57)
  at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>
  (SparkSQLCLIDriver.scala:281)
  at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main
  (SparkSQLCLIDriver.scala:138)
  at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main
  (SparkSQLCLIDriver.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke
  (NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke
  (DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy
  $SparkSubmit$$runMain(SparkSubmit.scala:727)
  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1
  (SparkSubmit.scala:183)
  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
  Caused by: java.lang.ClassNotFoundException:
  com.sun.jersey.api.client.config.ClientConfig
  at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  ... 22 more
  16/05/09 13:15:21 INFO storage.DiskBlockManager: Shutdown hook called
  16/05/09 13:15:21 INFO util.ShutdownHookManager: Shutdown hook called
  16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting
  directory /tmp/spark-ac33b501-b9c3-47a3-93c8-fa02720bf4bb
  16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting
  directory /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c
  16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting
  directory /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c/userFiles-46dde536-29e5-46b3-a530-e5ad6640f8b2








                                                                                                                                            
                                                                                                                                            
                                                                                                                                            
                                                                                                                                            
JESSE CHEN
Big Data Performance | IBM Analytics

Office: 408 463 2296
Mobile: 408 828 9068
Email: jfchen@us.ibm.com
                                                                                                                                            
                                                                                                                                            


Re: spark 2.0 issue with yarn?

Posted by Sean Owen <so...@cloudera.com>.
The reason I ask how you're running is that YARN itself should have the
classes that YARN needs, and they should be on your classpath. That's why
they're not in Spark.

There could still be a subtler problem. The best way to troubleshoot, if
you want to act on it now, is to look at the exclusions for Jersey 1
artifacts put in place in that PR, try removing the ones related to YARN,
and see if that remedies it. If so, we may need to undo that part.
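
To check whether any jar on the relevant classpath actually supplies the
missing Jersey 1 class, a quick scan like the following can help (a sketch:
the helper name and the example directories are mine, not from this thread;
adjust the paths to your installation):

```shell
# List jars under the given directories that contain a given class file.
find_class_jar() {
  class_file="$1"; shift
  for dir in "$@"; do
    for jar in "$dir"/*.jar; do
      [ -f "$jar" ] || continue
      # unzip -l prints the jar's entry paths; grep for the class entry.
      if unzip -l "$jar" 2>/dev/null | grep -q "$class_file"; then
        echo "$jar"
      fi
    done
  done
  return 0
}

# Example (hypothetical paths):
#   find_class_jar 'com/sun/jersey/api/client/config/ClientConfig.class' \
#     "$SPARK_HOME/jars" "$HADOOP_HOME/share/hadoop/yarn/lib"
```

If no jar turns up, Jersey 1's jersey-client is simply absent from the
classpath, which matches the NoClassDefFoundError above.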

On Mon, May 9, 2016 at 10:52 PM, Jesse F Chen <jf...@us.ibm.com> wrote:

> Sean - thanks. definitely related to SPARK-12154.
>
> Is there a way to continue use Jersey 1 for existing working environment?
>
> Or, what's the best way to patch up existing Jersey 1 environment to
> Jersey 2?
>
> This does break all of our Spark jobs running Spark 2.0 on YARN.
>
>
> *JESSE CHEN*
> Big Data Performance | IBM Analytics
>
> Office: 408 463 2296
> Mobile: 408 828 9068
> Email: jfchen@us.ibm.com
>
>
> From: Sean Owen <so...@cloudera.com>
> To: Jesse F Chen/San Francisco/IBM@IBMUS
> Cc: spark users <us...@spark.apache.org>, dev <de...@spark.apache.org>, Roy
> Cecil <ro...@ie.ibm.com>, Matt Cheah <mc...@palantir.com>
> Date: 05/09/2016 02:19 PM
> Subject: Re: spark 2.0 issue with yarn?
> ------------------------------
>
>
>
> Hm, this may be related to updating to Jersey 2, which happened 4 days
> ago: https://issues.apache.org/jira/browse/SPARK-12154
>
> That is a Jersey 1 class that's missing. How are you building and running
> Spark?
>
> I think the theory was that Jersey 1 would still be supplied at runtime.
> We may have to revise the exclusions.
>
> On Mon, May 9, 2016 at 9:24 PM, Jesse F Chen <jfchen@us.ibm.com> wrote:
>
>    I had been running fine until builds around 05/07/2016....
>
>    If I used the "--master yarn" in builds after 05/07, I got the
>    following error...sounds like something jars are missing.
>
>    I am using YARN 2.7.2 and Hive 1.2.1.
>
>    Do I need something new to deploy related to YARN?
>
>    bin/spark-sql -driver-memory 10g --verbose --master yarn --packages
>    com.databricks:spark-csv_2.10:1.3.0 --executor-memory 4g --num-executors 20
>    --executor-cores 2
>
>    16/05/09 13:15:21 INFO server.Server: jetty-8.y.z-SNAPSHOT
>    16/05/09 13:15:21 INFO server.AbstractConnector: Started
>    SelectChannelConnector@0.0.0.0:4041
>    16/05/09 13:15:21 INFO util.Utils: Successfully started service
>    'SparkUI' on port 4041.
>    16/05/09 13:15:21 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and
>    started at http://bigaperf116.svl.ibm.com:4041
>
>    Exception in thread "main" java.lang.NoClassDefFoundError:
>    com/sun/jersey/api/client/config/ClientConfig
>    at
>    org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:45)
>    at
>    org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:163)
>    at
>    org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>    at
>    org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:150)
>    at
>    org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
>    at
>    org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:148)
>    at org.apache.spark.SparkContext.<init>(SparkContext.scala:502)
>    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2246)
>    at
>    org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:762)
>    at
>    org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:57)
>    at
>    org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:281)
>    at
>    org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:138)
>    at
>    org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
>    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    at
>    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>    at
>    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    at java.lang.reflect.Method.invoke(Method.java:497)
>    at
>    org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:727)
>    at
>    org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
>    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
>    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
>    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>    Caused by: java.lang.ClassNotFoundException:
>    com.sun.jersey.api.client.config.ClientConfig
>    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>    ... 22 more
>    16/05/09 13:15:21 INFO storage.DiskBlockManager: Shutdown hook called
>    16/05/09 13:15:21 INFO util.ShutdownHookManager: Shutdown hook called
>    16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory
>    /tmp/spark-ac33b501-b9c3-47a3-93c8-fa02720bf4bb
>    16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory
>    /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c
>    16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory
>    /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c/userFiles-46dde536-29e5-46b3-a530-e5ad6640f8b2
>
>
>
>
>
>
> JESSE CHEN
> Big Data Performance | IBM Analytics
>
> Office: 408 463 2296
> Mobile: 408 828 9068
> Email: jfchen@us.ibm.com
>
>
>
>
>
>

Re: spark 2.0 issue with yarn?

Posted by Sean Owen <so...@cloudera.com>.
The reason I ask how you're running is that YARN itself should supply the
classes it needs, and they should already be on your classpath. That's why
they're not bundled with Spark.
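
Since YARN is expected to provide these classes, one untested stopgap (a
sketch only; the jar names and paths below are illustrative and depend on
your Hadoop layout) is to put the Jersey 1 client jars that ship with a
Hadoop 2.7.x install back on Spark's classpath explicitly:

```shell
# Untested workaround sketch: expose Hadoop's bundled Jersey 1 jars to
# the Spark driver and executors. Paths are illustrative; adjust to
# where your Hadoop install actually keeps its YARN library jars.
JERSEY1_JARS=$(ls "$HADOOP_HOME"/share/hadoop/yarn/lib/jersey-*.jar | tr '\n' ':')

bin/spark-sql --master yarn \
  --driver-class-path "$JERSEY1_JARS" \
  --conf spark.executor.extraClassPath="$JERSEY1_JARS"
```

If the NoClassDefFoundError goes away with this, that would confirm the
problem is purely the missing Jersey 1 classes rather than a deeper
incompatibility.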

There could still be a subtler problem. The best way to troubleshoot, if
you want to act on it now, is to look at the exclusions for the Jersey 1
artifacts put in place in that PR, try removing the ones related to YARN,
and see if that remedies it. If so, we may need to undo that part.
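
For reference, the kind of exclusion in question looks roughly like this in
a Maven build (a hypothetical sketch; the exact YARN artifacts and Jersey
modules excluded by the SPARK-12154 PR may differ):

```xml
<!-- Sketch of a Jersey 1 exclusion on a YARN dependency. Removing such
     an exclusion lets the Jersey 1 client classes that YarnClientImpl's
     TimelineClient needs back onto the compile/runtime classpath. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-yarn-client</artifactId>
  <version>${hadoop.version}</version>
  <exclusions>
    <exclusion>
      <groupId>com.sun.jersey</groupId>
      <artifactId>jersey-client</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```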

On Mon, May 9, 2016 at 10:52 PM, Jesse F Chen <jf...@us.ibm.com> wrote:

> Sean - thanks. Definitely related to SPARK-12154.
>
> Is there a way to continue using Jersey 1 in an existing working
> environment?
>
> Or, what's the best way to migrate an existing Jersey 1 environment to
> Jersey 2?
>
> This does break all of our Spark jobs running Spark 2.0 on YARN.
>
>
> JESSE CHEN
> Big Data Performance | IBM Analytics
>
> Office: 408 463 2296
> Mobile: 408 828 9068
> Email: jfchen@us.ibm.com
>
>
> From: Sean Owen <so...@cloudera.com>
> To: Jesse F Chen/San Francisco/IBM@IBMUS
> Cc: spark users <us...@spark.apache.org>, dev <de...@spark.apache.org>, Roy
> Cecil <ro...@ie.ibm.com>, Matt Cheah <mc...@palantir.com>
> Date: 05/09/2016 02:19 PM
> Subject: Re: spark 2.0 issue with yarn?
> ------------------------------
>
>
>
> Hm, this may be related to updating to Jersey 2, which happened 4 days
> ago: https://issues.apache.org/jira/browse/SPARK-12154
>
> That is a Jersey 1 class that's missing. How are you building and running
> Spark?
>
> I think the theory was that Jersey 1 would still be supplied at runtime.
> We may have to revise the exclusions.
>
> On Mon, May 9, 2016 at 9:24 PM, Jesse F Chen <jfchen@us.ibm.com> wrote:
>
>    I had been running fine until builds around 05/07/2016....
>
>    If I used the "--master yarn" in builds after 05/07, I got the
>    following error...sounds like something jars are missing.
>
>    I am using YARN 2.7.2 and Hive 1.2.1.
>
>    Do I need something new to deploy related to YARN?
>
>    bin/spark-sql --driver-memory 10g --verbose --master yarn --packages
>    com.databricks:spark-csv_2.10:1.3.0 --executor-memory 4g --num-executors 20
>    --executor-cores 2
>
>    16/05/09 13:15:21 INFO server.Server: jetty-8.y.z-SNAPSHOT
>    16/05/09 13:15:21 INFO server.AbstractConnector: Started
>    SelectChannelConnector@0.0.0.0:4041
>    16/05/09 13:15:21 INFO util.Utils: Successfully started service
>    'SparkUI' on port 4041.
>    16/05/09 13:15:21 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and
>    started at http://bigaperf116.svl.ibm.com:4041
>
>    Exception in thread "main" java.lang.NoClassDefFoundError:
>    com/sun/jersey/api/client/config/ClientConfig
>    at
>    org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:45)
>    at
>    org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:163)
>    at
>    org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>    at
>    org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:150)
>    at
>    org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
>    at
>    org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:148)
>    at org.apache.spark.SparkContext.<init>(SparkContext.scala:502)
>    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2246)
>    at
>    org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:762)
>    at
>    org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:57)
>    at
>    org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:281)
>    at
>    org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:138)
>    at
>    org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
>    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    at
>    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>    at
>    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    at java.lang.reflect.Method.invoke(Method.java:497)
>    at
>    org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:727)
>    at
>    org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
>    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
>    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
>    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>    Caused by: java.lang.ClassNotFoundException:
>    com.sun.jersey.api.client.config.ClientConfig
>    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>    ... 22 more
>    16/05/09 13:15:21 INFO storage.DiskBlockManager: Shutdown hook called
>    16/05/09 13:15:21 INFO util.ShutdownHookManager: Shutdown hook called
>    16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory
>    /tmp/spark-ac33b501-b9c3-47a3-93c8-fa02720bf4bb
>    16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory
>    /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c
>    16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory
>    /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c/userFiles-46dde536-29e5-46b3-a530-e5ad6640f8b2
>
>
>
>
>
>
> JESSE CHEN
> Big Data Performance | IBM Analytics
>
> Office: 408 463 2296
> Mobile: 408 828 9068
> Email: jfchen@us.ibm.com
>
>
>
>
>
>

Re: spark 2.0 issue with yarn?

Posted by Sean Owen <so...@cloudera.com>.
Hm, this may be related to updating to Jersey 2, which happened 4 days ago:
https://issues.apache.org/jira/browse/SPARK-12154

That is a Jersey 1 class that's missing. How are you building and running
Spark?

I think the theory was that Jersey 1 would still be supplied at runtime. We
may have to revise the exclusions.
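
Another possible avenue (a hedged suggestion, not verified against this
exact cluster): YarnClientImpl only instantiates the Jersey-based
TimelineClient when the YARN timeline service is enabled, so disabling it
for the job may sidestep the missing class entirely. Spark copies
spark.hadoop.* settings into the Hadoop Configuration it builds:

```shell
# Sketch: keep YarnClientImpl from ever loading the Jersey 1
# TimelineClient by turning the timeline service off for this job.
# yarn.timeline-service.enabled is a standard YARN property; the
# spark.hadoop. prefix forwards it into Spark's Hadoop Configuration.
bin/spark-sql --master yarn \
  --conf spark.hadoop.yarn.timeline-service.enabled=false
```

The trade-off is losing application-timeline integration for those jobs,
which may or may not matter in a given deployment.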

On Mon, May 9, 2016 at 9:24 PM, Jesse F Chen <jf...@us.ibm.com> wrote:

> I had been running fine until builds around 05/07/2016....
>
> If I used the "--master yarn" in builds after 05/07, I got the following
> error...sounds like something jars are missing.
>
> I am using YARN 2.7.2 and Hive 1.2.1.
>
> Do I need something new to deploy related to YARN?
>
> bin/spark-sql --driver-memory 10g --verbose --master yarn --packages
> com.databricks:spark-csv_2.10:1.3.0 --executor-memory 4g --num-executors 20
> --executor-cores 2
>
> 16/05/09 13:15:21 INFO server.Server: jetty-8.y.z-SNAPSHOT
> 16/05/09 13:15:21 INFO server.AbstractConnector: Started
> SelectChannelConnector@0.0.0.0:4041
> 16/05/09 13:15:21 INFO util.Utils: Successfully started service 'SparkUI'
> on port 4041.
> 16/05/09 13:15:21 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started
> at http://bigaperf116.svl.ibm.com:4041
> Exception in thread "main" java.lang.NoClassDefFoundError:
> com/sun/jersey/api/client/config/ClientConfig
> at
> org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:45)
> at
> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:163)
> at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
> at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:150)
> at
> org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
> at
> org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:148)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:502)
> at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2246)
> at
> org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:762)
> at
> org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:57)
> at
> org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:281)
> at
> org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:138)
> at
> org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:727)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.ClassNotFoundException:
> com.sun.jersey.api.client.config.ClientConfig
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> ... 22 more
> 16/05/09 13:15:21 INFO storage.DiskBlockManager: Shutdown hook called
> 16/05/09 13:15:21 INFO util.ShutdownHookManager: Shutdown hook called
> 16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory
> /tmp/spark-ac33b501-b9c3-47a3-93c8-fa02720bf4bb
> 16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory
> /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c
> 16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory
> /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c/userFiles-46dde536-29e5-46b3-a530-e5ad6640f8b2
>
>
>
>
>
> JESSE CHEN
> Big Data Performance | IBM Analytics
>
> Office: 408 463 2296
> Mobile: 408 828 9068
> Email: jfchen@us.ibm.com
>
>
