Posted to user@spark.apache.org by wwilkins <ww...@expedia.com> on 2014/09/10 17:42:10 UTC

Cassandra connector

Hi,
I am having difficulty getting the Cassandra connector running within the Spark shell.

My jars look like:
[wwilkins@phel-spark-001 logs]$ ls -altr /opt/connector/
total 14588
drwxr-xr-x. 5 root root    4096 Sep  9 22:15 ..
-rw-r--r--  1 root root  242809 Sep  9 22:20 spark-cassandra-connector-master.zip
-rw-r--r--  1 root root  541332 Sep  9 22:20 cassandra-driver-core-2.0.3.jar
-rw-r--r--  1 root root 1855685 Sep  9 22:20 cassandra-thrift-2.0.9.jar
-rw-r--r--  1 root root   30085 Sep  9 22:20 commons-codec-1.2.jar
-rw-r--r--  1 root root  315805 Sep  9 22:20 commons-lang3-3.1.jar
-rw-r--r--  1 root root   60686 Sep  9 22:20 commons-logging-1.1.1.jar
-rw-r--r--  1 root root 2228009 Sep  9 22:20 guava-16.0.1.jar
-rw-r--r--  1 root root  433368 Sep  9 22:20 httpclient-4.2.5.jar
-rw-r--r--  1 root root  227275 Sep  9 22:20 httpcore-4.2.4.jar
-rw-r--r--  1 root root 1222059 Sep  9 22:20 ivy-2.3.0.jar.bak
-rw-r--r--  1 root root   38460 Sep  9 22:20 joda-convert-1.2.jar.bak
-rw-r--r--  1 root root   98818 Sep  9 22:20 joda-convert-1.6.jar
-rw-r--r--  1 root root  581571 Sep  9 22:20 joda-time-2.3.jar
-rw-r--r--  1 root root  217053 Sep  9 22:20 libthrift-0.9.1.jar
-rw-r--r--  1 root root     618 Sep  9 22:20 log4j.properties
-rw-r--r--  1 root root  165505 Sep  9 22:20 lz4-1.2.0.jar
-rw-r--r--  1 root root   85448 Sep  9 22:20 metrics-core-3.0.2.jar
-rw-r--r--  1 root root 1231993 Sep  9 22:20 netty-3.9.0.Final.jar
-rw-r--r--  1 root root   26083 Sep  9 22:20 slf4j-api-1.7.2.jar.bak
-rw-r--r--  1 root root   26084 Sep  9 22:20 slf4j-api-1.7.5.jar
-rw-r--r--  1 root root 1251514 Sep  9 22:20 snappy-java-1.0.5.jar
-rw-r--r--  1 root root  776782 Sep  9 22:20 spark-cassandra-connector_2.10-1.0.0-beta1.jar
-rw-r--r--  1 root root  997458 Sep  9 22:20 spark-cassandra-connector_2.10-1.0.0-SNAPSHOT.jar.bak3
-rwxr--r--  1 root root 1113208 Sep  9 22:20 spark-cassandra-connector_2.10-1.1.0-SNAPSHOT.jar.bak
-rw-r--r--  1 root root 1111804 Sep  9 22:20 spark-cassandra-connector_2.10-1.1.0-SNAPSHOT.jar.bak2


I launch the shell with this command:
/data/spark/bin/spark-shell --driver-class-path $(echo /opt/connector/*.jar |sed 's/ /:/g')
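For reference, the $(echo ... | sed ...) part just joins the glob matches with colons. The same pattern on a throwaway directory (the /tmp path and file names here are illustrative) shows what the shell passes to --driver-class-path; note that the *.jar glob is also what keeps the renamed .bak files in the listing above out of the classpath:

```shell
# Join a directory of jars into a colon-separated classpath,
# the same way the spark-shell invocation above does.
mkdir -p /tmp/cp-demo
touch /tmp/cp-demo/a.jar /tmp/cp-demo/b.jar /tmp/cp-demo/ivy.jar.bak
CP=$(echo /tmp/cp-demo/*.jar | sed 's/ /:/g')
echo "$CP"   # prints /tmp/cp-demo/a.jar:/tmp/cp-demo/b.jar
```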


I run these commands:
sc.stop
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import com.datastax.spark.connector._

val conf = new SparkConf()
conf.set("spark.cassandra.connection.host", "10.208.59.164")
val sc = new SparkContext(conf)
val table = sc.cassandraTable("retail", "ordercf")


And I get this problem:
scala> val table = sc.cassandraTable("retail", "ordercf")
java.lang.AbstractMethodError
        at org.apache.spark.Logging$class.log(Logging.scala:52)
        at com.datastax.spark.connector.cql.CassandraConnector$.log(CassandraConnector.scala:145)
        at org.apache.spark.Logging$class.logDebug(Logging.scala:63)
        at com.datastax.spark.connector.cql.CassandraConnector$.logDebug(CassandraConnector.scala:145)
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createCluster(CassandraConnector.scala:155)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$4.apply(CassandraConnector.scala:152)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$4.apply(CassandraConnector.scala:152)
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:36)
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:61)
        at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:72)
        at com.datastax.spark.connector.cql.Schema.keyspaces$lzycompute(Schema.scala:111)
        at com.datastax.spark.connector.cql.Schema.keyspaces(Schema.scala:110)
        at com.datastax.spark.connector.cql.Schema.tables$lzycompute(Schema.scala:123)
        at com.datastax.spark.connector.cql.Schema.tables(Schema.scala:122)
        at com.datastax.spark.connector.rdd.CassandraRDD.tableDef$lzycompute(CassandraRDD.scala:196)
        at com.datastax.spark.connector.rdd.CassandraRDD.tableDef(CassandraRDD.scala:195)
        at com.datastax.spark.connector.rdd.CassandraRDD.<init>(CassandraRDD.scala:202)
        at com.datastax.spark.connector.package$SparkContextFunctions.cassandraTable(package.scala:94)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:22)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
        at $iwC$$iwC$$iwC.<init>(<console>:35)
        at $iwC$$iwC.<init>(<console>:37)
        at $iwC.<init>(<console>:39)
        at <init>(<console>:41)
        at .<init>(<console>:45)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:789)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1062)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:615)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:646)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:610)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:814)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:859)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:771)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:616)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:624)
        at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:629)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:954)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:902)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:997)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:317)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:73)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)



Any insight would be appreciated.

Thanks,

Wade Wilkins
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Cassandra-connector-tp13896.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

RE: Cassandra connector

Posted by Wade Wilkins <ww...@expedia.com>.
Thank you!

I am running Spark 1.0, but your suggestion worked for me. I commented out all the logDebug calls in both
CassandraConnector.scala
and
Schema.scala

I am moving again.
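The root cause seems to be a mismatch between the connector build and the Spark version on the classpath, so the longer-term fix is presumably to depend on a connector artifact built against the matching Spark release rather than patching out the log calls. Something like this in build.sbt — the version numbers are illustrative, not authoritative; check the connector project's compatibility notes for the real pairing:

```scala
// build.sbt -- versions are illustrative, not authoritative;
// pair the connector line with your actual Spark release.
libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-core"                % "1.1.0" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0-beta1"
)
```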

Regards,
Wade

-----Original Message-----
From: gtinside [mailto:gtinside@gmail.com] 
Sent: Wednesday, September 10, 2014 8:49 AM
To: user@spark.incubator.apache.org
Subject: Re: Cassandra connector

Are you using Spark 1.1? If so, you will need to update the DataStax Cassandra connector code and remove the references to the log methods from CassandraConnector.scala.

Regards,
Gaurav



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Cassandra-connector-tp13896p13897.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org




Re: Cassandra connector

Posted by gtinside <gt...@gmail.com>.
Are you using Spark 1.1? If so, you will need to update the DataStax Cassandra connector code and remove the references to the log methods from CassandraConnector.scala.

Regards,
Gaurav



