Posted to user@spark.apache.org by Judy Nash <ju...@exchange.microsoft.com> on 2014/12/01 07:53:36 UTC

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

Thanks Patrick and Cheng for the suggestions.

The issue was that the Hadoop common jar had been added to the classpath. After I removed the Hadoop common jar from both master and slave, I was able to bypass the error.
This was caused by a local change, so there is no impact on the 1.2 release.
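
In case it helps someone check for the same thing, here is roughly how a stray Hadoop jar can be spotted (a sketch only; the sh version is shown, the .cmd equivalent is analogous, and paths depend on your layout):

# Print the classpath Spark computes and look for an extra hadoop-common jar.
./bin/compute-classpath.sh | tr ':' '\n' | grep -i 'hadoop-common'
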
-----Original Message-----
From: Patrick Wendell [mailto:pwendell@gmail.com] 
Sent: Wednesday, November 26, 2014 8:17 AM
To: Judy Nash
Cc: Denny Lee; Cheng Lian; user@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

Just to double check - I looked at our own assembly jar and I confirmed that our Hadoop Configuration class does use the correctly shaded version of Guava. My best guess here is that somehow a separate Hadoop library is ending up on the classpath, possibly because Spark put it there somehow.

> jar xf spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar
> cd org/apache/hadoop/
> javap -v Configuration | grep Precond

Warning: Binary file Configuration contains org.apache.hadoop.conf.Configuration

   #497 = Utf8               org/spark-project/guava/common/base/Preconditions

   #498 = Class              #497         // "org/spark-project/guava/common/base/Preconditions"

   #502 = Methodref          #498.#501    // "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V

        12: invokestatic  #502                // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V

        50: invokestatic  #502                // Method "org/spark-project/guava/common/base/Preconditions".checkArgument:(ZLjava/lang/Object;)V

On Wed, Nov 26, 2014 at 11:08 AM, Patrick Wendell <pw...@gmail.com> wrote:
> Hi Judy,
>
> Are you somehow modifying Spark's classpath to include jars from 
> Hadoop and Hive that you have running on the machine? The issue seems 
> to be that you are somehow including a version of Hadoop that 
> references the original guava package. The Hadoop that is bundled in 
> the Spark jars should not do this.
>
> - Patrick
>
> On Wed, Nov 26, 2014 at 1:45 AM, Judy Nash 
> <ju...@exchange.microsoft.com> wrote:
>> Looks like a config issue. I ran the SparkPi job and it is still failing
>> with the same Guava error.
>>
>> Command ran:
>>
>> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --class 
>> org.apache.spark.examples.SparkPi --master spark://headnodehost:7077 
>> --executor-memory 1G --num-executors 1 
>> .\lib\spark-examples-1.2.1-SNAPSHOT-hadoop2.4.0.jar 100
>>
>>
>>
>> I had used the same build steps on Spark 1.1 with no issue.
>>
>>
>>
>> From: Denny Lee [mailto:denny.g.lee@gmail.com]
>> Sent: Tuesday, November 25, 2014 5:47 PM
>> To: Judy Nash; Cheng Lian; user@spark.incubator.apache.org
>>
>>
>> Subject: Re: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> To determine if this is a Windows vs. other configuration, can you 
>> just try to call the Spark-class.cmd SparkSubmit without actually 
>> referencing the Hadoop or Thrift server classes?
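>>
>> For example (just a sketch - I believe a bare --help is enough to load
>> SparkSubmit itself without touching any Hive or Thrift entry point):
>>
>> .\bin\spark-class.cmd org.apache.spark.deploy.SparkSubmit --help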
>>
>>
>>
>>
>>
>> On Tue Nov 25 2014 at 5:42:09 PM Judy Nash 
>> <ju...@exchange.microsoft.com>
>> wrote:
>>
>> I traced the code and used the following command:
>>
>> Spark-class.cmd org.apache.spark.deploy.SparkSubmit --class
>> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 
>> spark-internal --hiveconf hive.server2.thrift.port=10000
>>
>>
>>
>> The issue ended up being much more fundamental, however. Spark doesn't
>> work at all in the configuration below. When I open spark-shell, it fails
>> with the same ClassNotFound error.
>>
>> Now I wonder whether this is a Windows-only issue or a problem with the
>> Hive/Hadoop configuration.
>>
>>
>>
>> From: Cheng Lian [mailto:lian.cs.zju@gmail.com]
>> Sent: Tuesday, November 25, 2014 1:50 AM
>>
>>
>> To: Judy Nash; user@spark.incubator.apache.org
>> Subject: Re: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> Oh so you're using Windows. What command are you using to start the 
>> Thrift server then?
>>
>> On 11/25/14 4:25 PM, Judy Nash wrote:
>>
>> Made progress but still blocked.
>>
>> After recompiling the code in cmd instead of PowerShell, I can now see
>> all 5 classes you mentioned.
>>
>> However I am still seeing the same error as before. Anything else I 
>> can check for?
>>
>>
>>
>> From: Judy Nash [mailto:judynash@exchange.microsoft.com]
>> Sent: Monday, November 24, 2014 11:50 PM
>> To: Cheng Lian; user@spark.incubator.apache.org
>> Subject: RE: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> This is what I got from jar tf:
>>
>> org/spark-project/guava/common/base/Preconditions.class
>>
>> org/spark-project/guava/common/math/MathPreconditions.class
>>
>> com/clearspring/analytics/util/Preconditions.class
>>
>> parquet/Preconditions.class
>>
>>
>>
>> I seem to have the class that was reported missing, but I am missing this file:
>>
>> com/google/inject/internal/util/$Preconditions.class
>>
>>
>>
>> Any suggestion on how to fix this?
>>
>> Very much appreciate the help as I am very new to Spark and open 
>> source technologies.
>>
>>
>>
>> From: Cheng Lian [mailto:lian.cs.zju@gmail.com]
>> Sent: Monday, November 24, 2014 8:24 PM
>> To: Judy Nash; user@spark.incubator.apache.org
>> Subject: Re: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> Hm, I tried exactly the same commit and the build command locally, 
>> but couldn't reproduce this.
>>
>> Usually this kind of error is caused by classpath misconfiguration.
>> Could you please try this to ensure corresponding Guava classes are 
>> included in the assembly jar you built?
>>
>> jar tf assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar | grep Preconditions
>>
>> On my machine I got these lines (the first line is the one reported 
>> as missing in your case):
>>
>> org/spark-project/guava/common/base/Preconditions.class
>>
>> org/spark-project/guava/common/math/MathPreconditions.class
>>
>> com/clearspring/analytics/util/Preconditions.class
>>
>> parquet/Preconditions.class
>>
>> com/google/inject/internal/util/$Preconditions.class
>>
>> On 11/25/14 6:25 AM, Judy Nash wrote:
>>
>> Thank you Cheng for responding.
>>
>>
>> Here is the commit SHA1 on the 1.2 branch I saw this failure in:
>>
>> commit 6f70e0295572e3037660004797040e026e440dbd
>>
>> Author: zsxwing <zs...@gmail.com>
>>
>> Date:   Fri Nov 21 00:42:43 2014 -0800
>>
>>
>>
>>     [SPARK-4472][Shell] Print "Spark context available as sc." only 
>> when SparkContext is created...
>>
>>
>>
>>     ... successfully
>>
>>
>>
>>     It's weird that printing "Spark context available as sc" when 
>> creating SparkContext unsuccessfully.
>>
>>
>>
>> Let me know if you need anything else.
>>
>>
>>
>> From: Cheng Lian [mailto:lian.cs.zju@gmail.com]
>> Sent: Friday, November 21, 2014 8:02 PM
>> To: Judy Nash; user@spark.incubator.apache.org
>> Subject: Re: latest Spark 1.2 thrift server fail with 
>> NoClassDefFoundError on Guava
>>
>>
>>
>> Hi Judy, could you please provide the commit SHA1 of the version 
>> you're using? Thanks!
>>
>> On 11/22/14 11:05 AM, Judy Nash wrote:
>>
>> Hi,
>>
>>
>>
>> The Thrift server is failing to start for me on the latest Spark 1.2 branch.
>>
>>
>>
>> I get the error below when I start the Thrift server.
>>
>> Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
>>
>>         at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)....
>>
>>
>>
>> Here is my setup:
>>
>> 1)      Latest spark 1.2 branch build
>>
>> 2)      Used build command:
>>
>> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package
>>
>> 3)      Added hive-site.xml to \conf
>>
>> 4)      Version on the box: Hive 0.13, Hadoop 2.4
>>
>>
>>
>> Is this a real bug or am I doing something wrong?
>>
>>
>>
>> -----------------------------------
>>
>> Full Stacktrace:
>>
>> Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
>>         at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)
>>         at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)
>>         at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)
>>         at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:82)
>>         at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
>>         at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
>>         at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
>>         at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
>>         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
>>         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
>>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
>>         at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
>>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:230)
>>         at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:38)
>>         at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:56)
>>         at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:353)
>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> Caused by: java.lang.ClassNotFoundException: com.google.common.base.Preconditions
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>
>>
>>
>>



RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

Posted by Judy Nash <ju...@exchange.microsoft.com>.
To report back on how I ultimately solved this issue, so that others can do the same:

1) Check each classpath and make sure the jars are listed in order of Guava version (i.e. spark-assembly needs to be listed before the Hadoop 2.4 jars, because spark-assembly has Guava 14 and Hadoop 2.4 has Guava 11). This may require updating compute-classpath.sh to get the ordering right; see the sketch after 2) below.

2) If the other jars use a higher Guava version, bump the Spark Guava library to that higher version. Guava is supposedly very backward compatible.
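
A rough sketch of what 1) means inside compute-classpath.sh (not the exact change; $FWDIR is the Spark home variable the script already uses, and the Hadoop path is just an example):

# Shaded-Guava-14 assembly first, externally added Hadoop jars (Guava 11) after it.
CLASSPATH="$FWDIR/assembly/target/scala-2.10/spark-assembly-1.2.1-SNAPSHOT-hadoop2.4.0.jar"
CLASSPATH="$CLASSPATH:/path/to/hadoop-2.4.0/share/hadoop/common/hadoop-common-2.4.0.jar"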

Hope this helps. 


Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Tue, Dec 2, 2014 at 11:22 AM, Judy Nash
<ju...@exchange.microsoft.com> wrote:
> Any suggestion on how a user with a custom Hadoop jar can solve this issue?

You'll need to add all the dependencies of that custom Hadoop jar to the
classpath. Those will include Guava (which is not included in its original
form as part of the Spark dependencies).
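
For example (a sketch only; the paths are placeholders, and SPARK_CLASSPATH is just the quick way to prepend entries in 1.x):

# Custom Hadoop jar plus its own dependencies, including the Guava it was built against.
export SPARK_CLASSPATH="/opt/custom/hadoop-common-2.4.0.jar:/opt/custom/guava-11.0.2.jar"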



-- 
Marcelo



RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

Posted by Judy Nash <ju...@exchange.microsoft.com>.
Any suggestion on how a user with a custom Hadoop jar can solve this issue?




Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

Posted by Patrick Wendell <pw...@gmail.com>.
Thanks Judy. While this is not directly caused by a Spark issue, it is
likely other users will run into this. It is an unfortunate consequence of
the way we've shaded Guava in this release: we rely on bytecode shading of
Hadoop itself as well, and if the user has their own Hadoop classes present
it can cause issues.
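
If you want to check whether a Hadoop jar you are adding is one of the problematic ones, the javap trick from earlier in the thread works against that jar directly (a sketch; the path is an example):

# An unshaded jar will print com/google/common/base/Preconditions here --
# exactly the class that no longer exists under that name in the shaded assembly.
javap -classpath /path/to/hadoop-common-2.4.0.jar -v org.apache.hadoop.conf.Configuration | grep Precond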


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org