Posted to user@spark.apache.org by satish chandra j <js...@gmail.com> on 2015/08/11 12:29:17 UTC

dse spark-submit multiple jars issue

*Hi,*

Please let me know if I am missing anything in the command below.

*Command:*

dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
--jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar
///home/missingmerch/dse.jar
///home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar
///home/missingmerch/etl-0.0.1-SNAPSHOT.jar


*Error:*

*java.lang.ClassNotFoundException: HelloWorld*
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:270)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:342)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


I understand that the way I am passing multiple jar file paths in the
command is the issue. Please suggest the correct format for providing
multiple jars in the command.




Thanks for the support


Satish Chandra

Re: dse spark-submit multiple jars issue

Posted by Andrew Or <an...@databricks.com>.
Hi Satish,

The problem is that `--jars` accepts a comma-delimited list of jars! E.g.

spark-submit ... --jars lib1.jar,lib2.jar,lib3.jar main.jar

where main.jar is your main application jar (the one that starts a
SparkContext), and lib*.jar refer to additional libraries that your main
application jar uses.
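Applied to the jar paths from this thread, the comma-joined list can be built like this (a sketch using the paths from the original post; the final jar, etl-0.0.1-SNAPSHOT.jar, is the application jar and stays a separate argument):

```shell
#!/bin/sh
# Join the library jar paths with commas for --jars. The application jar
# is NOT part of the --jars list; it is passed as the primary resource.
# Paths below are the ones quoted earlier in the thread.
jars=$(printf '%s,' \
  /home/missingmerch/postgresql-9.4-1201.jdbc41.jar \
  /home/missingmerch/dse.jar \
  /home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar)
jars=${jars%,}   # drop the trailing comma left by printf
echo dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld \
  --jars "$jars" /home/missingmerch/etl-0.0.1-SNAPSHOT.jar
```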

-Andrew


Re: dse spark-submit multiple jars issue

Posted by Javier Domingo Cansino <ja...@fon.com>.
Please notice the 'jars: null' line in your verbose output.

I don't know why you put the '///' prefix, but I would propose that you use
normal absolute paths.

dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
--jars /home/missingmerch/postgresql-9.4-1201.jdbc41.jar
/home/missingmerch/dse.jar
/home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar
/home/missingmerch/etl-0.0.1-SNAPSHOT.jar

Hope this is helpful!

Javier Domingo Cansino
Research & Development Engineer, Fon <http://www.fon.com/>
+34 946545847 | Skype: javier.domingo.fon


Re: dse spark-submit multiple jars issue

Posted by satish chandra j <js...@gmail.com>.
Hi,

Please find the log details below:


dse spark-submit --verbose --master local --class HelloWorld
etl-0.0.1-SNAPSHOT.jar --jars
file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar
file:/home/missingmerch/dse.jar
file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar

Using properties file: /etc/dse/spark/spark-defaults.conf
Adding default property: spark.cassandra.connection.factory=com.datastax.bdp.spark.DseCassandraConnectionFactory
Adding default property: spark.ssl.keyStore=.keystore
Adding default property: spark.ssl.enabled=false
Adding default property: spark.ssl.trustStore=.truststore
Adding default property: spark.cassandra.auth.conf.factory=com.datastax.bdp.spark.DseAuthConfFactory
Adding default property: spark.ssl.keyPassword=cassandra
Adding default property: spark.ssl.keyStorePassword=cassandra
Adding default property: spark.ssl.protocol=TLS
Adding default property: spark.ssl.useNodeLocalConf=true
Adding default property: spark.ssl.trustStorePassword=cassandra
Adding default property: spark.ssl.enabledAlgorithms=TLS_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_CBC_SHA

Parsed arguments:
  master                  local
  deployMode              null
  executorMemory          null
  executorCores           null
  totalExecutorCores      null
  propertiesFile          /etc/dse/spark/spark-defaults.conf
  driverMemory            512M
  driverCores             null
  driverExtraClassPath    null
  driverExtraLibraryPath  null
  driverExtraJavaOptions  -Dcassandra.username=missingmerch -Dcassandra.password=STMbrjrlb -XX:MaxPermSize=256M
  supervise               false
  queue                   null
  numExecutors            null
  files                   null
  pyFiles                 null
  archives                null
  mainClass               HelloWorld
  primaryResource         file:/home/missingmerch/etl-0.0.1-SNAPSHOT.jar
  name                    HelloWorld
  childArgs               [--jars file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar file:/home/missingmerch/dse.jar file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar]
  jars                    null
  verbose                 true

Spark properties used, including those specified through --conf and those
from the properties file /etc/dse/spark/spark-defaults.conf:
  spark.cassandra.connection.factory -> com.datastax.bdp.spark.DseCassandraConnectionFactory
  spark.ssl.useNodeLocalConf -> true
  spark.ssl.enabled -> false
  spark.executor.extraJavaOptions -> -XX:MaxPermSize=256M
  spark.ssl.keyStore -> .keystore
  spark.ssl.trustStore -> .truststore
  spark.ssl.trustStorePassword -> cassandra
  spark.ssl.enabledAlgorithms -> TLS_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_CBC_SHA
  spark.cassandra.auth.conf.factory -> com.datastax.bdp.spark.DseAuthConfFactory
  spark.ssl.protocol -> TLS
  spark.ssl.keyPassword -> cassandra
  spark.ssl.keyStorePassword -> cassandra

Main class:
HelloWorld

Arguments:
--jars
file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar
file:/home/missingmerch/dse.jar
file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar

System properties:
spark.cassandra.connection.factory -> com.datastax.bdp.spark.DseCassandraConnectionFactory
spark.driver.memory -> 512M
spark.ssl.useNodeLocalConf -> true
spark.ssl.enabled -> false
SPARK_SUBMIT -> true
spark.executor.extraJavaOptions -> -XX:MaxPermSize=256M
spark.app.name -> HelloWorld
spark.ssl.enabledAlgorithms -> TLS_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_CBC_SHA
spark.ssl.trustStorePassword -> cassandra
spark.driver.extraJavaOptions -> -Dcassandra.username=missingmerch -Dcassandra.password=STMbrjrlb -XX:MaxPermSize=256M
spark.ssl.keyStore -> .keystore
spark.ssl.trustStore -> .truststore
spark.jars -> file:/home/missingmerch/etl-0.0.1-SNAPSHOT.jar
spark.cassandra.auth.conf.factory -> com.datastax.bdp.spark.DseAuthConfFactory
spark.master -> local
spark.ssl.protocol -> TLS
spark.ssl.keyPassword -> cassandra
spark.ssl.keyStorePassword -> cassandra

Classpath elements:
file:/home/missingmerch/etl-0.0.1-SNAPSHOT.jar

WARN  2015-08-11 08:23:25 org.apache.spark.util.Utils: Service 'SparkUI'
could not bind on port 4040. Attempting port 4041.

Exception in thread "main" java.lang.ClassNotFoundException: org.postgresql.Driver
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:190)
        at HelloWorld$.main(HelloWorld.scala:26)
        at HelloWorld.main(HelloWorld.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
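[Editorial note on the log above: 'jars: null' together with the 'childArgs [--jars ...]' line shows that the --jars flag and the jar paths were delivered to HelloWorld as application arguments, because spark-submit treats everything after the primary resource as arguments to the main class. A sketch of the corrected invocation, with options before the application jar and the jar list comma-separated (duplicate postgresql entry removed):]

```shell
#!/bin/sh
# Sketch of the corrected verbose test command: all spark-submit options,
# including --jars with a comma-separated list, must come BEFORE the primary
# resource (etl-0.0.1-SNAPSHOT.jar). Jar paths are the ones from the log.
cmd="dse spark-submit --verbose --master local --class HelloWorld \
--jars file:/home/missingmerch/postgresql-9.4-1201.jdbc41.jar,file:/home/missingmerch/dse.jar \
etl-0.0.1-SNAPSHOT.jar"
echo "$cmd"
```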


Regards,

Satish Chandra J


Re: dse spark-submit multiple jars issue

Posted by Javier Domingo Cansino <ja...@fon.com>.
Use --verbose; it might give you some insight into what's happening.

Javier Domingo Cansino
Research & Development Engineer, Fon <http://www.fon.com/>
+34 946545847 | Skype: javier.domingo.fon

On Tue, Aug 11, 2015 at 2:44 PM, satish chandra j <js...@gmail.com>
wrote:

> HI ,
> I have used --jars option as well, please find the command below
>
> *Command:*
>
> dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
> *--jars* ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar
> ///home/missingmerch/dse.jar
> ///home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar
> ///home/missingmerch/etl-0.0.1-SNAPSHOT.jar
>
> Regards,
> Satish
>
> On Tue, Aug 11, 2015 at 4:08 PM, Javier Domingo Cansino <
> javier.domingo@fon.com> wrote:
>
>> I have no real idea (not java user), but have you tried with the --jars
>> option?
>>
>>
>> http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management
>>
>> AFAIK, you are currently submitting the jar names as arguments to the
>> called Class instead of the jars themselves
>>
>>
>> On Tue, Aug 11, 2015 at 12:29 PM, satish chandra j <
>> jsatishchandra@gmail.com> wrote:
>>
>>>
>>> *HI,*
>>>
>>> Please let me know if i am missing anything in the command below
>>>
>>> *Command:*
>>>
>>> dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
>>> --jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar
>>> ///home/missingmerch/dse.jar
>>> ///home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar
>>> ///home/missingmerch/etl-0.0.1-SNAPSHOT.jar
>>>
>>>
>>> *Error:*
>>>
>>> *java.lang.ClassNotFoundException: HelloWorld*
>>>
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>>
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>
>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>>
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>>
>>>         at java.lang.Class.forName0(Native Method)
>>>
>>>         at java.lang.Class.forName(Class.java:270)
>>>
>>>         at
>>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:342)
>>>
>>>         at
>>> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>>
>>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>
>>>
>>> I understand the way I am giving multiple jar file paths in the command
>>> is an issue, please provide an appropriate format for providing multiple
>>> jars in the command
>>>
>>>
>>>
>>>
>>> Thanks for support
>>>
>>>
>>> Satish Chandra
>>>
>>
>>
>

Re: dse spark-submit multiple jars issue

Posted by satish chandra j <js...@gmail.com>.
Hi,
I have used the --jars option as well; please find the command below:

*Command:*

dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
*--jars* ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar
///home/missingmerch/dse.jar
///home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar
///home/missingmerch/etl-0.0.1-SNAPSHOT.jar

Regards,
Satish
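
As Andrew Or's reply notes, `--jars` takes a single comma-delimited value, with the main application jar listed separately at the end. A sketch of the corrected command, built from the paths in this thread (assuming, as the thread suggests, that HelloWorld lives in etl-0.0.1-SNAPSHOT.jar):

```shell
# --jars wants ONE comma-delimited argument; the application jar comes
# last on its own. Building the list explicitly for readability:
jars=/home/missingmerch/postgresql-9.4-1201.jdbc41.jar
jars=$jars,/home/missingmerch/dse.jar
jars=$jars,/home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar

echo dse spark-submit --master spark://10.246.43.15:7077 \
  --class HelloWorld --jars "$jars" \
  /home/missingmerch/etl-0.0.1-SNAPSHOT.jar
```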

On Tue, Aug 11, 2015 at 4:08 PM, Javier Domingo Cansino <
javier.domingo@fon.com> wrote:

> I have no real idea (not java user), but have you tried with the --jars
> option?
>
>
> http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management
>
> AFAIK, you are currently submitting the jar names as arguments to the
> called Class instead of the jars themselves
>
>
> On Tue, Aug 11, 2015 at 12:29 PM, satish chandra j <
> jsatishchandra@gmail.com> wrote:
>
>>
>> *HI,*
>>
>> Please let me know if i am missing anything in the command below
>>
>> *Command:*
>>
>> dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
>> --jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar
>> ///home/missingmerch/dse.jar
>> ///home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar
>> ///home/missingmerch/etl-0.0.1-SNAPSHOT.jar
>>
>>
>> *Error:*
>>
>> *java.lang.ClassNotFoundException: HelloWorld*
>>
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>
>>         at java.security.AccessController.doPrivileged(Native Method)
>>
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>
>>         at java.lang.Class.forName0(Native Method)
>>
>>         at java.lang.Class.forName(Class.java:270)
>>
>>         at
>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:342)
>>
>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>>
>> I understand the way I am giving multiple jar file paths in the command
>> is an issue, please provide an appropriate format for providing multiple
>> jars in the command
>>
>>
>>
>>
>> Thanks for support
>>
>>
>> Satish Chandra
>>
>
>

Re: dse spark-submit multiple jars issue

Posted by Javier Domingo Cansino <ja...@fon.com>.
I have no real idea (I'm not a Java user), but have you tried the --jars
option?

http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management

AFAIK, you are currently submitting the jar names as arguments to the
called class instead of passing the jars themselves.
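
Concretely, my reading of how spark-submit splits the space-separated command from this thread (a sketch, not verified against the poster's setup): `--jars` consumes only the one path immediately after it, the next bare path becomes the application jar, and everything after that is handed to the class as plain program arguments. That would explain the ClassNotFoundException: HelloWorld is looked up in dse.jar, not in etl-0.0.1-SNAPSHOT.jar.

```shell
# Simulating spark-submit's argument parsing with shell positional
# parameters, using the paths from the thread:
set -- --jars /home/missingmerch/postgresql-9.4-1201.jdbc41.jar \
       /home/missingmerch/dse.jar \
       /home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar \
       /home/missingmerch/etl-0.0.1-SNAPSHOT.jar
shift                       # consume the --jars flag itself
extra_jars=$1; shift        # --jars takes exactly one value
app_jar=$1; shift           # next bare path = application jar
echo "extra jars:      $extra_jars"
echo "application jar: $app_jar"   # --class is resolved against this jar
echo "program args:    $*"         # the remaining jars are just strings
```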


On Tue, Aug 11, 2015 at 12:29 PM, satish chandra j <jsatishchandra@gmail.com
> wrote:

>
> *HI,*
>
> Please let me know if i am missing anything in the command below
>
> *Command:*
>
> dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld
> --jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar
> ///home/missingmerch/dse.jar
> ///home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar
> ///home/missingmerch/etl-0.0.1-SNAPSHOT.jar
>
>
> *Error:*
>
> *java.lang.ClassNotFoundException: HelloWorld*
>
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>
>         at java.security.AccessController.doPrivileged(Native Method)
>
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>
>         at java.lang.Class.forName0(Native Method)
>
>         at java.lang.Class.forName(Class.java:270)
>
>         at
> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:342)
>
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
> I understand the way I am giving multiple jar file paths in the command is
> an issue, please provide an appropriate format for providing multiple jars
> in the command
>
>
>
>
> Thanks for support
>
>
> Satish Chandra
>