Posted to user@spark.apache.org by satish chandra j <js...@gmail.com> on 2015/08/13 16:29:25 UTC
Re: saveToCassandra not working in Spark Job but works in Spark Shell
Hi,
Please let me know if I am missing anything in the mail below, so the
issue can be fixed.
Regards,
Satish Chandra
On Wed, Aug 12, 2015 at 6:59 PM, satish chandra j <js...@gmail.com>
wrote:
> Hi,
>
> The code below works fine in the Spark shell, but when the same code is
> placed in a Spark application it fails with the error below:
>
> Exception in thread "main" java.lang.NoSuchMethodError:
> com.datastax.spark.connector.package$.toRDDFunctions(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;)Lcom/datastax/spark/connector/RDDFunctions
>
>
> Code:
>
> import org.apache.spark.SparkContext
> import org.apache.spark.SparkContext._
> import org.apache.spark.SparkConf
> import org.apache.spark.rdd.JdbcRDD
> import com.datastax.spark.connector._
> import com.datastax.spark.connector.cql.CassandraConnector
> import com.datastax.bdp.spark.DseSparkConfHelper._
> import java.sql.{Connection, DriverManager, ResultSet,
>   PreparedStatement, SQLException, Statement}
>
> object HelloWorld {
>   def main(args: Array[String]) {
>     def createSparkContext() = {
>       val conf = new SparkConf()
>         .set("spark.cassandra.connection.host", "10.246.43.15")
>         .setAppName("First Spark App")
>         .setMaster("local")
>         .set("cassandra.username", "username")
>         .set("cassandra.password", "password")
>         .forDse
>       new SparkContext(conf)
>     }
>
>     val sc = createSparkContext()
>     val user = "user"
>     val pass = "password"
>     Class.forName("org.postgresql.Driver").newInstance
>     val url = "jdbc:postgresql://gptester:5432/db_test"
>     val myRDD27 = new JdbcRDD(sc,
>       () => DriverManager.getConnection(url, user, pass),
>       "select * from wmax_vmax.arm_typ_txt LIMIT ? OFFSET ?",
>       5, 0, 1,
>       (r: ResultSet) => (r.getInt("alarm_type_code"),
>         r.getString("language_code"),
>         r.getString("alrm_type_cd_desc")))
>
>     myRDD27.saveToCassandra("keyspace", "arm_typ_txt",
>       SomeColumns("alarm_type_code", "language_code", "alrm_type_cd_desc"))
>
>     println(myRDD27.count())
>     println(myRDD27.first)
>     sc.stop()
>     sys.exit()
>   }
> }
>
>
>
> Command:
> dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
>   --jars /home/missingmerch/postgresql-9.4-1201.jdbc41.jar
>   etl-0.0.1-SNAPSHOT.jar
>
> Please let me know if there is any solution for this issue.
>
> Regards,
> Satish Chandra
>
Re: saveToCassandra not working in Spark Job but works in Spark Shell
Posted by satish chandra j <js...@gmail.com>.
Hi Akhil,
Which jar versions are conflicting, and what needs to be done to fix this?
Regards,
Satish Chandra
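One way to answer that question (a sketch; the directories listed are illustrative and should be extended to wherever jars live on your nodes) is to list every copy of the connector jar the driver or executors could load:

```shell
# Sketch -- directories are illustrative. Every connector jar found here
# is a candidate for the version conflict; compare the versions in the
# file names against the version the application was compiled against.
for d in "${DSE_HOME:-/usr/share/dse}" "$HOME"/.ivy2 "$HOME"/.m2; do
  [ -d "$d" ] && find "$d" -name 'spark-cassandra-connector*.jar' 2>/dev/null
done
```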
On Fri, Aug 14, 2015 at 2:44 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:
> Looks like a jar version conflict to me.
>
> Thanks
> Best Regards
>
Re: saveToCassandra not working in Spark Job but works in Spark Shell
Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Looks like a jar version conflict to me.
Thanks
Best Regards
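The usual remedy for such a conflict (a sketch only, assuming the application is built with sbt; the version shown is illustrative and must match whatever connector your DSE release actually ships) is to compile against DSE's own connector version and mark it "provided" so the application jar does not bundle a second copy:

```scala
// build.sbt fragment -- illustrative; pin the version to the connector
// bundled with your DSE release so compile-time and runtime agree.
libraryDependencies +=
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.1" % "provided"
```

With the dependency scoped as "provided", the assembled application jar leaves the connector out, so only DSE's copy is on the classpath at runtime and the NoSuchMethodError should disappear.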
On Thu, Aug 13, 2015 at 7:59 PM, satish chandra j <js...@gmail.com>
wrote:
> Hi,
> Please let me know if I am missing anything in the mail below, so the
> issue can be fixed.
>
> Regards,
> Satish Chandra