Posted to user@hive.apache.org by Mich Talebzadeh <mi...@peridale.co.uk> on 2015/12/03 18:54:08 UTC

Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Trying to run Hive on the Spark 1.3 execution engine, I get the following:

 

conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256

15/12/03 17:53:18 [stderr-redir-1]: INFO client.SparkClientImpl: Spark assembly has been built with Hive, including Datanucleus jars on classpath

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=1000

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: 15/12/03 17:53:19 INFO client.RemoteDriver: Connecting to: rhes564:36577

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:146)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at java.lang.reflect.Method.invoke(Method.java:606)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 

Any clues?

 

 

Mich Talebzadeh

 

Sybase ASE 15 Gold Medal Award 2008

A Winning Strategy: Running the most Critical Financial Data on ASE 15

http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf

Author of the book "A Practitioner's Guide to Upgrading to Sybase ASE 15", ISBN 978-0-9563693-0-7.

Co-author of "Sybase Transact SQL Guidelines Best Practices", ISBN 978-0-9759693-0-4

Publications due shortly:

Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8

Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume one out shortly

http://talebzadehmich.wordpress.com

NOTE: The information in this email is proprietary and confidential. This message is for the designated recipient only; if you are not the intended recipient, you should destroy it immediately. Any information in this message shall not be understood as given or endorsed by Peridale Technology Ltd, its subsidiaries or their employees, unless expressly so stated. It is the responsibility of the recipient to ensure that this email is virus free; therefore neither Peridale Ltd, its subsidiaries nor their employees accept any responsibility.

 


Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Marcelo Vanzin <va...@cloudera.com>.
(bcc: user@spark, since this is Hive code.)

You're probably including unneeded Spark jars in Hive's classpath
somehow: either the whole assembly or the spark-hive jar, both of which
contain Hive classes, in this case old versions that conflict with the
version of Hive you're running.
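
One quick way to confirm that, as a rough sketch (the paths assume the
/usr/lib/hive and /usr/lib/spark layout shown later in this thread), is
to scan every jar on both sides for the conflicting class:

# Print each jar that bundles RpcConfiguration; a hit inside a Spark
# assembly jar would point at exactly this kind of conflict.
for j in $HIVE_HOME/lib/*.jar $SPARK_HOME/lib/*.jar; do
  if unzip -l "$j" 2>/dev/null | grep -q 'org/apache/hive/spark/client/rpc/RpcConfiguration.class'; then
    echo "$j"
  fi
done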

On Thu, Dec 3, 2015 at 9:54 AM, Mich Talebzadeh <mi...@peridale.co.uk> wrote:
> [...]



-- 
Marcelo

Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Ted Yu <yu...@gmail.com>.
Looks like SPARK_RPC_CLIENT_CONNECT_TIMEOUT was introduced by
SPARK-8064, which went into 1.5.0.

Please check your classpath as Marcelo suggested.
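
The stack trace suggests the missing field is a constant on Hive's
HiveConf class (an assumption based on where the <clinit> fails). A
sketch of checking which definition a given jar carries, using the
hive-exec jar named later in this thread:

# List the SPARK_RPC constants compiled into HiveConf$ConfVars; an old
# HiveConf (e.g. one bundled inside a Spark assembly) will lack them.
javap -classpath /usr/lib/hive/lib/hive-exec-1.2.1.jar \
  'org.apache.hadoop.hive.conf.HiveConf$ConfVars' | grep SPARK_RPC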

On Thu, Dec 3, 2015 at 10:15 AM, Mich Talebzadeh <mi...@peridale.co.uk>
wrote:

> Thanks, I tried all of those.
>
> I am trying to make Hive use Spark; apparently Hive can use version 1.3
> of Spark as its execution engine. Frankly I don't know why this is not
> working!
>
> Mich Talebzadeh
>
> [...]
>
> From: Furcy Pin [mailto:furcy.pin@flaminem.com]
> Sent: 03 December 2015 18:07
> To: user@hive.apache.org
> Cc: user@spark.apache.org
> Subject: Re: Any clue on this error, Exception in thread "main"
> java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
>
> Maybe you compile and run against different versions of Spark?
>
> On Thu, Dec 3, 2015 at 6:54 PM, Mich Talebzadeh <mi...@peridale.co.uk>
> wrote:
>
> [...]

RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Mich Talebzadeh <mi...@peridale.co.uk>.
Thanks, I downloaded the one suggested.

 

Unfortunately I get the following error when I run start-master.sh:

 

hduser@rhes564::/home/hduser> start-master.sh

starting org.apache.spark.deploy.master.Master, logging to /usr/lib/spark/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

failed to launch org.apache.spark.deploy.master.Master:

        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

        ... 6 more

full log in /usr/lib/spark/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

hduser@rhes564::/home/hduser> cat /usr/lib/spark/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

Spark Command: /usr/java/latest/bin/java -cp /usr/lib/spark/sbin/../conf/:/usr/lib/spark/lib/spark-assembly-1.5.2-hadoop2.2.0.jar:/home/hduser/hadoop-2.6.0/etc/hadoop/ -Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip rhes564 --port 7077 --webui-port 8080

========================================

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger

        at java.lang.Class.getDeclaredMethods0(Native Method)

        at java.lang.Class.privateGetDeclaredMethods(Class.java:2521)

        at java.lang.Class.getMethod0(Class.java:2764)

        at java.lang.Class.getMethod(Class.java:1653)

        at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)

        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)

Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger

        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)

        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)

        at java.security.AccessController.doPrivileged(Native Method)

        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

        ... 6 more

 

I suspect this is, as before, due to this build having fewer jar files in $SPARK_HOME/lib; see the sketch after the two listings below.

 

hduser@rhes564::/usr/lib/spark/lib> ltr

total 207620

-rw-r--r-- 1 hduser hadoop 102573339 Nov  3 18:04 spark-examples-1.5.2-hadoop2.2.0.jar

-rw-r--r-- 1 hduser hadoop 105357751 Nov  3 18:04 spark-assembly-1.5.2-hadoop2.2.0.jar

-rw-r--r-- 1 hduser hadoop   4433953 Nov  3 18:04 spark-1.5.2-yarn-shuffle.jar

 

 

Compared to the prebuilt one:

 

hduser@rhes564::/usr/lib/spark_ori/lib> ltr

total 303876

-rw-r--r-- 1 hduser hadoop 118360126 Nov  3 18:05 spark-examples-1.5.2-hadoop2.6.0.jar

-rw-r--r-- 1 hduser hadoop 183993445 Nov  3 18:05 spark-assembly-1.5.2-hadoop2.6.0.jar

-rw-r--r-- 1 hduser hadoop   4433953 Nov  3 18:05 spark-1.5.2-yarn-shuffle.jar

-rw-r--r-- 1 hduser hadoop   1809447 Nov  3 18:05 datanucleus-rdbms-3.2.9.jar

-rw-r--r-- 1 hduser hadoop   1890075 Nov  3 18:05 datanucleus-core-3.2.10.jar

-rw-r--r-- 1 hduser hadoop    339666 Nov  3 18:05 datanucleus-api-jdo-3.2.6.jar
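
If the missing slf4j classes are the cause, note that the "user-provided
Hadoop" tarball expects you to supply the Hadoop jars (which carry slf4j)
yourself. A sketch of one way to do that, per Spark's "Hadoop free" build
notes (the Hadoop path is taken from the CLASSPATH shown later in this
thread):

# In $SPARK_HOME/conf/spark-env.sh: put the Hadoop distribution's jars,
# slf4j included, on Spark's classpath before starting the master.
export SPARK_DIST_CLASSPATH=$(/home/hduser/hadoop-2.6.0/bin/hadoop classpath)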

 

 

 

 

Mich Talebzadeh

 


 

 

-----Original Message-----
From: Marcelo Vanzin [mailto:vanzin@cloudera.com] 
Sent: 03 December 2015 23:44
To: Mich Talebzadeh <mi...@peridale.co.uk>
Cc: user@hive.apache.org
Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

I spoke to Xuefu (Hive dev) and mentioned that this isn't really how it should be done.

 

In the meantime, if you can, you should use a Spark package that does not include Hive classes. There used to be an explicit one for that, but I can't find it. In the meantime, the tarball that says "pre-built with user-provided Hadoop" should work for your case.

 

On Thu, Dec 3, 2015 at 3:41 PM, Mich Talebzadeh <mich@peridale.co.uk> wrote:

> [...]

--

Marcelo


Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Marcelo Vanzin <va...@cloudera.com>.
I spoke to Xuefu (Hive dev) and mentioned that this isn't really how
it should be done.

In the meantime, if you can, you should use a Spark package that does
not include Hive classes. There used to be an explicit one for that,
but I can't find it. In the meantime, the tarball that says "pre-built
with user-provided Hadoop" should work for your case.
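
For reference, the Hive on Spark getting-started notes of that era also
described building a Spark assembly without the Hive profile, which has
the same effect. A hedged sketch (profile names follow the Spark 1.x
build docs; adjust them to your Hadoop version):

# Build a Spark distribution whose assembly omits Hive classes
# (no -Phive), so Hive's own jars win on the classpath.
./make-distribution.sh --name hadoop2-without-hive --tgz \
  -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0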

On Thu, Dec 3, 2015 at 3:41 PM, Mich Talebzadeh <mi...@peridale.co.uk> wrote:
> Just noticed that hive shell in 1.2.1 makes a reference to SPARK_HOME if it
> finds it
>
>
>
>
>
> # add Spark assembly jar to the classpath
>
> if [[ -n "$SPARK_HOME" ]]
>
> then
>
>   sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`
>
>   CLASSPATH="${CLASSPATH}:${sparkAssemblyPath}"
>
> fi
>
>
>
>
>
> Is this expected?
>
> [...]



-- 
Marcelo

RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Mich Talebzadeh <mi...@peridale.co.uk>.
Just noticed that the hive shell in 1.2.1 makes a reference to SPARK_HOME if it finds it:

 

 

# add Spark assembly jar to the classpath

if [[ -n "$SPARK_HOME" ]]

then

  sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`

  CLASSPATH="${CLASSPATH}:${sparkAssemblyPath}"

fi

 

 

Is this expected?
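
One way to see what the shell will pick up, and to try a run without it
(a sketch: env -u unsets SPARK_HOME for that one invocation, so the
[[ -n "$SPARK_HOME" ]] guard above skips the assembly):

# Show which assembly bin/hive would add, then launch hive without it.
ls ${SPARK_HOME}/lib/spark-assembly-*.jar
env -u SPARK_HOME hive --version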

 

Mich Talebzadeh

 


 

From: Mich Talebzadeh [mailto:mich@peridale.co.uk] 
Sent: 03 December 2015 19:46
To: user@hive.apache.org; 'Marcelo Vanzin' <va...@cloudera.com>
Subject: RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

[...]

 

 

-----Original Message-----
From: Mich Talebzadeh [mailto:mich@peridale.co.uk]
Sent: 03 December 2015 19:02
To: 'Marcelo Vanzin' <vanzin@cloudera.com>
Cc: user@hive.apache.org; 'user' <user@spark.apache.org>
Subject: RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

Hi Marcelo.

 

So this is the approach I am going to take:

 

Use Spark 1.3 pre-built

Use Hive 1.2.1. Do not copy anything over from the Spark 1.3 libraries into the Hive libraries

Use Hadoop 2.6

There is no need to mess around with the libraries. I will unset my CLASSPATH, set it again, and retry.

 

 

Thanks,

 

 

Mich Talebzadeh

 


 

-----Original Message-----

From: Marcelo Vanzin [mailto:vanzin@cloudera.com]

Sent: 03 December 2015 18:45

To: Mich Talebzadeh <mich@peridale.co.uk>

Cc: user@hive.apache.org; user <user@spark.apache.org>

Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

On Thu, Dec 3, 2015 at 10:32 AM, Mich Talebzadeh <mich@peridale.co.uk> wrote:

 

> hduser@rhes564::/usr/lib/spark/logs> hive --version

> SLF4J: Found binding in
> [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

 

As I suggested before, you have Spark's assembly in the Hive classpath. That's not the way to configure hive-on-spark; if the documentation you're following tells you to do that, it's wrong.

 

(And sorry Ted, but please ignore Ted's suggestion. Hive-on-Spark should work fine with Spark 1.3 if it's configured correctly. You really don't want to be overriding Hive classes with the ones shipped in the Spark assembly, regardless of the version of Spark being used.)
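
For anyone following along later, a minimal sketch of the session
settings the Hive on Spark getting-started guide describes for pointing
Hive at a standalone Spark master (the master URL reuses the host and
port seen earlier in this thread, and some_table is a placeholder):

# Illustrative only; run against a table that exists in your metastore.
hive -e "
set hive.execution.engine=spark;
set spark.master=spark://rhes564:7077;
set spark.eventLog.enabled=true;
select count(*) from some_table;
"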

 

--

Marcelo

 

---------------------------------------------------------------------

To unsubscribe, e-mail: user-unsubscribe@spark.apache.org For additional commands, e-mail: user-help@spark.apache.org


RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Mich Talebzadeh <mi...@peridale.co.uk>.
Hi,

 

This is my CLASSPATH, which I have simplified, running with Hive 1.2.1 and the generic build of Spark 1.3:

 

unset CLASSPATH

CLASSPATH=$HADOOP_HOME/share/hadoop/common/hadoop-common-2.6.0-tests.jar:$HADOOP_HOME/share/hadoop/common/hadoop-common-2.6.0.jar:hadoop-nfs-2.6.0.jar:$HIVE_HOME/lib:${SPARK_HOME}/lib

 

echo $CLASSPATH

export CLASSPATH

 

 

CLASSPATH is now

 

/home/hduser/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/home/hduser/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar:hadoop-nfs-2.6.0.jar:/usr/lib/hive/lib:/usr/lib/spark/lib

 

However, I get the error. Does anyone have a working CLASSPATH for this?

 

 

 

.spark.client.RemoteDriver /usr/lib/hive/lib/hive-exec-1.2.1.jar --remote-host rhes564 --remote-port 51642 --conf hive.spark.client.connect.timeout=1000 --conf hive.spark.client.server.connect.timeout=90000 --conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256

15/12/03 19:42:51 [stderr-redir-1]: INFO client.SparkClientImpl: Spark assembly has been built with Hive, including Datanucleus jars on classpath

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=1000

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: 15/12/03 19:42:52 INFO client.RemoteDriver: Connecting to: rhes564:51642

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:146)

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at java.lang.reflect.Method.invoke(Method.java:606)

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)

15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 

 

 

 

Mich Talebzadeh

 


 

 

-----Original Message-----
From: Mich Talebzadeh [mailto:mich@peridale.co.uk] 
Sent: 03 December 2015 19:02
To: 'Marcelo Vanzin' <va...@cloudera.com>
Cc: user@hive.apache.org; 'user' <us...@spark.apache.org>
Subject: RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

Hi Marcelo.

 

So this is the approach I am going to take:

 

Use spark 1.3 pre-built

Use Hive 1.2.1. Do not copy anything from the Spark 1.3 libraries into the Hive libraries.

Use Hadoop 2.6

 

There is no need to mess around with the libraries. I will unset my CLASSPATH, set it afresh, and try again.

 

 

Thanks,

 

 

Mich Talebzadeh

 


 

-----Original Message-----

From: Marcelo Vanzin [mailto:vanzin@cloudera.com]

Sent: 03 December 2015 18:45

To: Mich Talebzadeh <mich@peridale.co.uk>

Cc: user@hive.apache.org; user <user@spark.apache.org>

Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

On Thu, Dec 3, 2015 at 10:32 AM, Mich Talebzadeh <mich@peridale.co.uk> wrote:

 

> hduser@rhes564::/usr/lib/spark/logs> hive --version

> SLF4J: Found binding in

> [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org

> /slf4j/impl/StaticLoggerBinder.class]

 

As I suggested before, you have Spark's assembly in the Hive classpath. That's not the way to configure hive-on-spark; if the documentation you're following tells you to do that, it's wrong.

 

(And sorry Ted, but please ignore Ted's suggestion. Hive-on-Spark should work fine with Spark 1.3 if it's configured correctly. You really don't want to be overriding Hive classes with the ones shipped in the Spark assembly, regardless of the version of Spark being used.)

 

--

Marcelo

 



RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Mich Talebzadeh <mi...@peridale.co.uk>.
Hi Marcelo.

So this is the approach I am going to take:

Use spark 1.3 pre-built
Use Hive 1.2.1. Do not copy anything from the Spark 1.3 libraries into the Hive libraries.
Use Hadoop 2.6

There is no need to mess around with the libraries. I will unset my CLASSPATH, set it afresh, and try again.
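
Concretely, the reset I have in mind is something like the following (a sketch only; note that a bare directory entry on a Java classpath only picks up .class files, so dir/* is needed to match jars):

# Rebuild CLASSPATH without ${SPARK_HOME}/lib so the Spark assembly's
# bundled Hive classes stay off Hive's classpath
unset CLASSPATH
export CLASSPATH="$HADOOP_HOME/share/hadoop/common/hadoop-common-2.6.0.jar:$HIVE_HOME/lib/*"
echo "$CLASSPATH"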


Thanks,


Mich Talebzadeh


-----Original Message-----
From: Marcelo Vanzin [mailto:vanzin@cloudera.com] 
Sent: 03 December 2015 18:45
To: Mich Talebzadeh <mi...@peridale.co.uk>
Cc: user@hive.apache.org; user <us...@spark.apache.org>
Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

On Thu, Dec 3, 2015 at 10:32 AM, Mich Talebzadeh <mi...@peridale.co.uk> wrote:

> hduser@rhes564::/usr/lib/spark/logs> hive --version
> SLF4J: Found binding in
> [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org
> /slf4j/impl/StaticLoggerBinder.class]

As I suggested before, you have Spark's assembly in the Hive classpath. That's not the way to configure hive-on-spark; if the documentation you're following tells you to do that, it's wrong.

(And sorry Ted, but please ignore Ted's suggestion. Hive-on-Spark should work fine with Spark 1.3 if it's configured correctly. You really don't want to be overriding Hive classes with the ones shipped in the Spark assembly, regardless of the version of Spark being used.)

--
Marcelo

Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Thu, Dec 3, 2015 at 10:32 AM, Mich Talebzadeh <mi...@peridale.co.uk> wrote:

> hduser@rhes564::/usr/lib/spark/logs> hive --version
> SLF4J: Found binding in
> [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

As I suggested before, you have Spark's assembly in the Hive
classpath. That's not the way to configure hive-on-spark; if the
documentation you're following tells you to do that, it's wrong.

(And sorry Ted, but please ignore Ted's suggestion. Hive-on-Spark
should work fine with Spark 1.3 if it's configured correctly. You
really don't want to be overriding Hive classes with the ones shipped
in the Spark assembly, regardless of the version of Spark being used.)
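
For concreteness: Hive-on-Spark is normally pointed at an existing Spark installation through configuration rather than by copying jars around. A minimal sketch, with property names from the Hive-on-Spark getting-started guide and illustrative values for this cluster:

# Point Hive at the Spark install via configuration instead of copying
# Spark jars into Hive's lib directory (values illustrative)
hive --hiveconf hive.execution.engine=spark \
     --hiveconf spark.home=/usr/lib/spark \
     --hiveconf spark.master=spark://rhes564:7077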

-- 
Marcelo


Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Ted Yu <yu...@gmail.com>.
Mich:
Please use Spark 1.5.0+ to work with Hive 1.2.1

Cheers

On Thu, Dec 3, 2015 at 10:32 AM, Mich Talebzadeh <mi...@peridale.co.uk>
wrote:

> Hi,
>
>
>
> These are my stack for now
>
>
>
> 1.    Spark version 1.3
>
> 2.    Hive version 1.2.1
>
> 3.    Hadoop version 2.6
>
>
>
> So I am using hive version 1.2.1
>
>
>
> hduser@rhes564::/usr/lib/spark/logs> hive --version
>
> SLF4J: Class path contains multiple SLF4J bindings.
>
> SLF4J: Found binding in
> [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>
> SLF4J: Found binding in
> [jar:file:/home/hduser/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
>
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>
> SLF4J: Class path contains multiple SLF4J bindings.
>
> SLF4J: Found binding in
> [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>
> SLF4J: Found binding in
> [jar:file:/home/hduser/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
>
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>
> Hive 1.2.1
>
> Subversion git://localhost.localdomain/home/sush/dev/hive.git -r
> 243e7c1ac39cb7ac8b65c5bc6988f5cc3162f558
>
> Compiled by sush on Fri Jun 19 02:03:48 PDT 2015
>
> From source with checksum ab480aca41b24a9c3751b8c023338231
>
>
>
>
>
> Thanks,
>
>
>
>
>
>
>
>
>
>
> From: Furcy Pin [mailto:furcy.pin@flaminem.com]
> Sent: 03 December 2015 18:22
> To: user@hive.apache.org
> Cc: user <us...@spark.apache.org>
> Subject: Re: Any clue on this error, Exception in thread "main"
> java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
>
>
>
> The field SPARK_RPC_CLIENT_CONNECT_TIMEOUT seems to have been added to
> Hive in the 1.1.0 release
>
>
>
>
> https://github.com/apache/hive/blob/release-1.1.0/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
>
>
>
> Are you using an older version of Hive somewhere?
>
>
>
>
>
> On Thu, Dec 3, 2015 at 7:15 PM, Mich Talebzadeh <mi...@peridale.co.uk>
> wrote:
>
> Thanks I tried all :(
>
>
>
> I am trying to make Hive use Spark and apparently Hive can use version 1.3
> of Spark as execution engine. Frankly I don’t know why this is not working!
>
>
>
> Mich Talebzadeh
>
>
>
>
> From: Furcy Pin [mailto:furcy.pin@flaminem.com]
> Sent: 03 December 2015 18:07
> To: user@hive.apache.org
> Cc: user@spark.apache.org
> Subject: Re: Any clue on this error, Exception in thread "main"
> java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
>
>
>
> maybe you compile and run against different versions of spark?
>
>
>
> On Thu, Dec 3, 2015 at 6:54 PM, Mich Talebzadeh <mi...@peridale.co.uk>
> wrote:
>
> Trying to run Hive on Spark 1.3 engine, I get
>
>
>
> conf hive.spark.client.channel.log.level=null --conf
> hive.spark.client.rpc.max.size=52428800 --conf
> hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256
>
> 15/12/03 17:53:18 [stderr-redir-1]: INFO client.SparkClientImpl: Spark
> assembly has been built with Hive, including Datanucleus jars on classpath
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.rpc.threads=8
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.secret.bits=256
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property:
> hive.spark.client.server.connect.timeout=90000
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: 15/12/03
> 17:53:19 INFO client.RemoteDriver: Connecting to: rhes564:36577
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:
> Exception in thread "main" java.lang.NoSuchFieldError:
> SPARK_RPC_CLIENT_CONNECT_TIMEOUT
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:146)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> java.lang.reflect.Method.invoke(Method.java:606)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
>
> Any clues?
>
>
>
>
>
> Mich Talebzadeh
>
>
>
>
>
>
>
>

RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Mich Talebzadeh <mi...@peridale.co.uk>.
Hi,

 

This is my stack for now

 

1.    Spark version 1.3

2.    Hive version 1.2.1

3.    Hadoop version 2.6

 

So I am using hive version 1.2.1

 

hduser@rhes564::/usr/lib/spark/logs> hive --version

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/home/hduser/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/home/hduser/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

Hive 1.2.1

Subversion git://localhost.localdomain/home/sush/dev/hive.git -r 243e7c1ac39cb7ac8b65c5bc6988f5cc3162f558

Compiled by sush on Fri Jun 19 02:03:48 PDT 2015

From source with checksum ab480aca41b24a9c3751b8c023338231

 

 

Thanks,

 

 

 


 

From: Furcy Pin [mailto:furcy.pin@flaminem.com] 
Sent: 03 December 2015 18:22
To: user@hive.apache.org
Cc: user <us...@spark.apache.org>
Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

The field SPARK_RPC_CLIENT_CONNECT_TIMEOUT seems to have been added to Hive in the 1.1.0 release

 

https://github.com/apache/hive/blob/release-1.1.0/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java

 

Are you using an older version of Hive somewhere?

 

 

On Thu, Dec 3, 2015 at 7:15 PM, Mich Talebzadeh <mich@peridale.co.uk> wrote:

Thanks I tried all :(

 

I am trying to make Hive use Spark and apparently Hive can use version 1.3 of Spark as execution engine. Frankly I don’t know why this is not working!

 

Mich Talebzadeh

 


 

From: Furcy Pin [mailto:furcy.pin@flaminem.com]
Sent: 03 December 2015 18:07
To: user@hive.apache.org
Cc: user@spark.apache.org
Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

maybe you compile and run against different versions of spark?

 

On Thu, Dec 3, 2015 at 6:54 PM, Mich Talebzadeh <mich@peridale.co.uk> wrote:

Trying to run Hive on Spark 1.3 engine, I get

 

conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256

15/12/03 17:53:18 [stderr-redir-1]: INFO client.SparkClientImpl: Spark assembly has been built with Hive, including Datanucleus jars on classpath

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=1000

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: 15/12/03 17:53:19 INFO client.RemoteDriver: Connecting to: rhes564:36577

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:146)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at java.lang.reflect.Method.invoke(Method.java:606)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 

Any clues?

 

 

Mich Talebzadeh

 


 

 

 


Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Furcy Pin <fu...@flaminem.com>.
The field SPARK_RPC_CLIENT_CONNECT_TIMEOUT seems to have been added to Hive
in the 1.1.0 release

https://github.com/apache/hive/blob/release-1.1.0/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java

Are you using an older version of Hive somewhere?
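
One way to check is to scan every candidate jar for the ConfVars constant; a jar whose HiveConf predates Hive 1.1.0 (for instance one bundled inside a Spark assembly) would trigger exactly this NoSuchFieldError. A sketch, with illustrative paths:

# Report which jars carry a HiveConf$ConfVars that knows about the new field
for j in /usr/lib/hive/lib/hive-exec-*.jar /usr/lib/spark/lib/spark-assembly-*.jar; do
  if unzip -p "$j" 'org/apache/hadoop/hive/conf/HiveConf$ConfVars.class' 2>/dev/null \
       | strings | grep -q SPARK_RPC_CLIENT_CONNECT_TIMEOUT; then
    echo "$j: has SPARK_RPC_CLIENT_CONNECT_TIMEOUT"
  else
    echo "$j: no (or pre-1.1.0) HiveConf\$ConfVars found"
  fi
done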


On Thu, Dec 3, 2015 at 7:15 PM, Mich Talebzadeh <mi...@peridale.co.uk> wrote:

> Thanks I tried all :(
>
>
>
> I am trying to make Hive use Spark and apparently Hive can use version 1.3
> of Spark as execution engine. Frankly I don’t know why this is not working!
>
>
>
> Mich Talebzadeh
>
>
>
>
>
>
> From: Furcy Pin [mailto:furcy.pin@flaminem.com]
> Sent: 03 December 2015 18:07
> To: user@hive.apache.org
> Cc: user@spark.apache.org
> Subject: Re: Any clue on this error, Exception in thread "main"
> java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
>
>
>
> maybe you compile and run against different versions of spark?
>
>
>
> On Thu, Dec 3, 2015 at 6:54 PM, Mich Talebzadeh <mi...@peridale.co.uk>
> wrote:
>
> Trying to run Hive on Spark 1.3 engine, I get
>
>
>
> conf hive.spark.client.channel.log.level=null --conf
> hive.spark.client.rpc.max.size=52428800 --conf
> hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256
>
> 15/12/03 17:53:18 [stderr-redir-1]: INFO client.SparkClientImpl: Spark
> assembly has been built with Hive, including Datanucleus jars on classpath
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.rpc.threads=8
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.secret.bits=256
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property:
> hive.spark.client.server.connect.timeout=90000
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: 15/12/03
> 17:53:19 INFO client.RemoteDriver: Connecting to: rhes564:36577
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:
> Exception in thread "main" java.lang.NoSuchFieldError:
> SPARK_RPC_CLIENT_CONNECT_TIMEOUT
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:146)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> java.lang.reflect.Method.invoke(Method.java:606)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
>
> 15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
>
> Any clues?
>
>
>
>
>
> Mich Talebzadeh
>
>
>
>
>
>
>
>

RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Mich Talebzadeh <mi...@peridale.co.uk>.
Thanks I tried all :(

 

I am trying to make Hive use Spark, and apparently Hive can use Spark 1.3 as its execution engine. Frankly I don’t know why this is not working!

 

Mich Talebzadeh

 


 

From: Furcy Pin [mailto:furcy.pin@flaminem.com] 
Sent: 03 December 2015 18:07
To: user@hive.apache.org
Cc: user@spark.apache.org
Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

maybe you compile and run against different versions of spark?

 

On Thu, Dec 3, 2015 at 6:54 PM, Mich Talebzadeh <mich@peridale.co.uk> wrote:

Trying to run Hive on Spark 1.3 engine, I get

 

conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256

15/12/03 17:53:18 [stderr-redir-1]: INFO client.SparkClientImpl: Spark assembly has been built with Hive, including Datanucleus jars on classpath

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=1000

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: 15/12/03 17:53:19 INFO client.RemoteDriver: Connecting to: rhes564:36577

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:146)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at java.lang.reflect.Method.invoke(Method.java:606)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)

15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 

Any clues?

 

 

Mich Talebzadeh

 

Sybase ASE 15 Gold Medal Award 2008

A Winning Strategy: Running the most Critical Financial Data on ASE 15

http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf

Author of the books "A Practitioner’s Guide to Upgrading to Sybase ASE 15", ISBN 978-0-9563693-0-7. 

co-author "Sybase Transact SQL Guidelines Best Practices", ISBN 978-0-9759693-0-4

Publications due shortly:

Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8

Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume one out shortly

 

http://talebzadehmich.wordpress.com <http://talebzadehmich.wordpress.com/> 

 

NOTE: The information in this email is proprietary and confidential. This message is for the designated recipient only, if you are not the intended recipient, you should destroy it immediately. Any information in this message shall not be understood as given or endorsed by Peridale Technology Ltd, its subsidiaries or their employees, unless expressly so stated. It is the responsibility of the recipient to ensure that this email is virus free, therefore neither Peridale Ltd, its subsidiaries nor their employees accept any responsibility.

 

 


RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Mich Talebzadeh <mi...@peridale.co.uk>.
Thanks, I tried them all :(

 

I am trying to make Hive use Spark as its execution engine, and apparently Hive can use Spark version 1.3 for this. Frankly, I don’t know why it is not working!
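
For reference, engine selection on my side looks roughly like this (a minimal sketch; the master URL and spark.home path are illustrative values, not necessarily what I have configured):

    -- run Hive queries on Spark instead of MapReduce
    set hive.execution.engine=spark;
    -- illustrative master; a standalone or YARN URL depending on the cluster
    set spark.master=yarn-client;
    -- illustrative path to the Spark 1.3 installation
    set spark.home=/usr/lib/spark;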

 

Mich Talebzadeh

 


 

From: Furcy Pin [mailto:furcy.pin@flaminem.com] 
Sent: 03 December 2015 18:07
To: user@hive.apache.org
Cc: user@spark.apache.org
Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

Maybe you compiled against one version of Spark and are running against another?

 



Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Furcy Pin <fu...@flaminem.com>.
Maybe you compiled against one version of Spark and are running against another?
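
A quick first check is to compare what is actually deployed on the box (a sketch; launcher locations vary by install):

    hive --version          # the Hive release on the classpath
    spark-submit --version  # the Spark release that Hive launches

A NoSuchFieldError is exactly the kind of binary incompatibility you get when code compiled against one version of a class runs against another, so a mismatch here would explain the stack trace.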


Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Posted by Marcelo Vanzin <va...@cloudera.com>.
(bcc: user@spark, since this is Hive code.)

You're probably including unneeded Spark jars on Hive's classpath
somehow: either the whole Spark assembly or the spark-hive jar. Both
bundle their own copies of the Hive classes, and in this case those
copies are old versions that conflict with the version of Hive you're
running. The "Spark assembly has been built with Hive, including
Datanucleus jars on classpath" line in your log points the same way.
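
One way to check (a sketch only; the assembly jar name and location
below are guesses for a typical Spark 1.3 layout, so adjust them to
your install):

    # does the Spark assembly bundle its own copy of HiveConf?
    jar tf $SPARK_HOME/lib/spark-assembly-1.3.0-hadoop2.4.0.jar \
        | grep 'org/apache/hadoop/hive/conf/HiveConf'

    # if so, does that bundled copy define the field the newer Hive code expects?
    # (assuming the field lives in HiveConf$ConfVars, as the Hive 1.x source suggests)
    javap -classpath $SPARK_HOME/lib/spark-assembly-1.3.0-hadoop2.4.0.jar \
        'org.apache.hadoop.hive.conf.HiveConf$ConfVars' | grep SPARK_RPC_CLIENT_CONNECT_TIMEOUT

If the first grep matches and the second prints nothing, an old
bundled HiveConf is loaded ahead of Hive's own, which is exactly what
this NoSuchFieldError looks like. The Hive on Spark getting-started
notes recommend pointing Hive at a Spark build that does not include
the Hive jars (i.e. one built without the -Phive profile).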




-- 
Marcelo
