Posted to user@spark.apache.org by shahab <sh...@gmail.com> on 2014/11/11 12:13:15 UTC

Cassandra spark connector exception: "NoSuchMethodError: com.google.common.collect.Sets.newConcurrentHashSet()Ljava/util/Set;"

Hi,

I have a Spark application which uses the Cassandra connector
"spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar" to load data from
Cassandra into Spark.

Everything works fine in local mode when I run it in my IDE, but when I
submit the application to run on a standalone Spark server, I get the
exception below, which is apparently related to Guava versions. Does anyone
know how to solve this?

I create a jar file of my Spark application using assembly.bat, and the
following are the dependencies I used.

I put "spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar" in the "lib/"
folder of my Eclipse project; that's why it is not included in the
dependencies:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-catalyst" % "1.1.0" % "provided",
  "org.apache.cassandra" % "cassandra-all" % "2.0.9" intransitive(),
  "org.apache.cassandra" % "cassandra-thrift" % "2.0.9" intransitive(),
  "net.jpountz.lz4" % "lz4" % "1.2.0",
  "org.apache.thrift" % "libthrift" % "0.9.1" exclude("org.slf4j", "slf4j-api") exclude("javax.servlet", "servlet-api"),
  "com.datastax.cassandra" % "cassandra-driver-core" % "2.0.4" intransitive(),
  "org.apache.spark" %% "spark-core" % "1.1.0" % "provided" exclude("org.apache.hadoop", "hadoop-core"),
  "org.apache.spark" %% "spark-streaming" % "1.1.0" % "provided",
  "org.apache.hadoop" % "hadoop-client" % "1.0.4" % "provided",
  "com.github.nscala-time" %% "nscala-time" % "1.0.0",
  "org.scalatest" %% "scalatest" % "1.9.1" % "test",
  "org.apache.spark" %% "spark-sql" % "1.1.0" % "provided",
  "org.apache.spark" %% "spark-hive" % "1.1.0" % "provided",
  "org.json4s" %% "json4s-jackson" % "3.2.5",
  "junit" % "junit" % "4.8.1" % "test",
  "org.slf4j" % "slf4j-api" % "1.7.7",
  "org.slf4j" % "slf4j-simple" % "1.7.7",
  "org.clapper" %% "grizzled-slf4j" % "1.0.2",
  "log4j" % "log4j" % "1.2.17",
  "com.google.guava" % "guava" % "16.0"
)

best,

/Shahab

And this is the exception I get:

Exception in thread "main" com.google.common.util.concurrent.ExecutionError: java.lang.NoSuchMethodError: com.google.common.collect.Sets.newConcurrentHashSet()Ljava/util/Set;
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2261)
        at com.google.common.cache.LocalCache.get(LocalCache.java:4000)
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4004)
        at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
        at org.apache.spark.sql.cassandra.CassandraCatalog.lookupRelation(CassandraCatalog.scala:39)
        at org.apache.spark.sql.cassandra.CassandraSQLContext$$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation(CassandraSQLContext.scala:60)
        at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply(Catalog.scala:123)
        at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply(Catalog.scala:123)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:123)
        at org.apache.spark.sql.cassandra.CassandraSQLContext$$anon$2.lookupRelation(CassandraSQLContext.scala:65)
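
Sets.newConcurrentHashSet() only exists in Guava 15.0 and later, so
presumably an older Guava somewhere on the cluster classpath is shadowing
the 16.0 I declared. A minimal sketch to confirm which jar actually supplies
Guava at runtime (the object name is just illustrative; submit it the same
way as the failing job):

    // Prints which jar the running JVM loaded Guava's Sets class from.
    object WhichGuava {
      def main(args: Array[String]): Unit = {
        val cls = Class.forName("com.google.common.collect.Sets")
        val loc = Option(cls.getProtectionDomain.getCodeSource)
          .map(_.getLocation.toString)
          .getOrElse("<bootstrap classpath>")
        println(s"Sets loaded from: $loc")
      }
    }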

Re: Cassandra spark connector exception: "NoSuchMethodError: com.google.common.collect.Sets.newConcurrentHashSet()Ljava/util/Set;"

Posted by shahab <sh...@gmail.com>.
Thanks Helena. I think I will wait for the new release and try it.

Thanks again,

/Shahab

On Tue, Nov 11, 2014 at 3:41 PM, Helena Edelson <helena.edelson@datastax.com> wrote:

> [...]

Re: Cassandra spark connector exception: "NoSuchMethodError: com.google.common.collect.Sets.newConcurrentHashSet()Ljava/util/Set;"

Posted by Helena Edelson <he...@datastax.com>.
Hi,
It looks like you are building from master (spark-cassandra-connector-assembly-1.2.0).
- Append this to your com.google.guava declaration: % "provided" (see the sketch after this list)
- Be sure your version of the connector dependency is the same as the assembly build. For instance, if you are using 1.1.0-beta1, build your assembly with that instead of master.
- You can upgrade your version of Cassandra to 2.1.0 if that is feasible for your deploy environment. Side note: we are releasing 1.1.0-beta2 today or tomorrow, which allows use of Cassandra 2.1.1 and fixes the Guava issues.
- Make your Cassandra server version and its dependencies match your Cassandra driver version: you currently have 2.0.9 with 2.0.4.
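
Concretely, the first point is a one-line change in your Seq; a sketch:

    // "provided": compiled against, but excluded from the assembly jar, so
    // the Guava already on the Spark classpath is the only one at runtime.
    "com.google.guava" % "guava" % "16.0" % "provided"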
 

- Helena
@helenaedelson


On Nov 11, 2014, at 6:13 AM, shahab <sh...@gmail.com> wrote:

> [...]