Posted to dev@spark.apache.org by Raymond Honderdors <Ra...@sizmek.com> on 2016/04/05 12:44:28 UTC

Build with Thrift Server & Scala 2.11

Is anyone looking into this one, Build with Thrift Server & Scala 2.11?
If so, when can we expect it?

Raymond Honderdors
Team Lead Analytics BI
Business Intelligence Developer
raymond.honderdors@sizmek.com
T +972.7325.3569
Herzliya



RE: Build with Thrift Server & Scala 2.11

Posted by Raymond Honderdors <Ra...@sizmek.com>.
Here is the error after a build with Scala 2.10:
“
Spark Command: /usr/lib/jvm/java-1.8.0/bin/java -cp /home/raymond.honderdors/Documents/IdeaProjects/spark/conf/:/home/raymond.honderdors/Documents/IdeaProjects/spark/assembly/target/scala-2.10/jars/* -Xms5g -Xmx5g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal
========================================
Exception in thread "main" java.lang.IncompatibleClassChangeError: Implementing class
                at java.lang.ClassLoader.defineClass1(Native Method)
                at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
                at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
                at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
                at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
                at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
                at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
                at java.security.AccessController.doPrivileged(Native Method)
                at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
                at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
                at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
                at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
                at java.lang.Class.getDeclaredMethods0(Native Method)
                at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
                at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
                at java.lang.Class.getMethod0(Class.java:3018)
                at java.lang.Class.getMethod(Class.java:1784)
                at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:710)
                at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
                at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
                at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
                at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
”
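
An IncompatibleClassChangeError raised while classes are still being defined usually means mismatched binary versions on the classpath, most often leftover Scala 2.11 jars mixed into a 2.10 build. As a minimal sketch, assuming the tree previously built for 2.11, the POMs would need to be switched back before rebuilding:

# reset every module's Scala binary version to 2.10 before a clean 2.10 build (assumption: building Spark from source)
./dev/change-scala-version.sh 2.10
mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.10 -DskipTests clean package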



Raymond Honderdors
Team Lead Analytics BI
Business Intelligence Developer
raymond.honderdors@sizmek.com
T +972.7325.3569
Herzliya

From: Raymond Honderdors [mailto:Raymond.Honderdors@sizmek.com]
Sent: Tuesday, April 05, 2016 4:23 PM
To: Reynold Xin <rx...@databricks.com>
Cc: dev@spark.apache.org
Subject: RE: Build with Thrift Server & Scala 2.11

I can see that the build is successful
(-Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests clean package)

The documentation page still says:
“
Building With Hive and JDBC Support
To enable Hive integration for Spark SQL along with its JDBC server and CLI, add the -Phive and -Phive-thriftserver profiles to your existing build options. By default Spark will build with Hive 0.13.1 bindings.

# Apache Hadoop 2.4.X with Hive 13 support
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package
Building for Scala 2.11
To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11 property:

./dev/change-scala-version.sh 2.11
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
Spark does not yet support its JDBC component for Scala 2.11.
”
Source: http://spark.apache.org/docs/latest/building-spark.html

When I try to start the thrift server I get the following error:
“
16/04/05 16:09:11 INFO BlockManagerMaster: Registered BlockManager
16/04/05 16:09:12 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode
                at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
                at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:312)
                at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
                at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
                at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:601)
                at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
                at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
                at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
                at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
                at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
                at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
                at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1667)
                at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:67)
                at org.apache.spark.SparkContext.<init>(SparkContext.scala:517)
                at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:57)
                at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:77)
                at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:498)
                at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:726)
                at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
                at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
                at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
                at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.UnknownHostException: namenode
                ... 26 more
16/04/05 16:09:12 INFO SparkUI: Stopped Spark web UI at http://10.10.182.195:4040
16/04/05 16:09:12 INFO SparkDeploySchedulerBackend: Shutting down all executors
”
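
The failure surfaces in EventLoggingListener via Utils.getHadoopFileSystem, so the unresolvable host "namenode" is most likely read out of an HDFS URI in the Spark or Hadoop configuration rather than produced by the build. As a hypothetical illustration only (these entries are assumed, not taken from this cluster), a reference like the following in conf/spark-defaults.conf would reproduce the error:

# hypothetical entries; host and port are illustrative
spark.eventLog.enabled   true
spark.eventLog.dir       hdfs://namenode:8020/spark-logs

The same host could equally come from fs.defaultFS in Hadoop's core-site.xml; either would make SparkContext resolve "namenode" at startup, exactly where this run fails.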



Raymond Honderdors
Team Lead Analytics BI
Business Intelligence Developer
raymond.honderdors@sizmek.com
T +972.7325.3569
Herzliya

From: Reynold Xin [mailto:rxin@databricks.com]
Sent: Tuesday, April 05, 2016 3:57 PM
To: Raymond Honderdors <Ra...@sizmek.com>
Cc: dev@spark.apache.org
Subject: Re: Build with Thrift Server & Scala 2.11

What do you mean? The Jenkins build for Spark uses 2.11 and also builds the thrift server.

On Tuesday, April 5, 2016, Raymond Honderdors <Ra...@sizmek.com> wrote:
Is anyone looking into this one, Build with Thrift Server & Scala 2.11?
If so, when can we expect it?

Raymond Honderdors
Team Lead Analytics BI
Business Intelligence Developer
raymond.honderdors@sizmek.com
T +972.7325.3569
Herzliya



Re: Build with Thrift Server & Scala 2.11

Posted by Raymond Honderdors <Ra...@sizmek.com>.
I did a check of that; I could not find it in any of the config files.

I also used config files that work with 1.6.1.
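
A check like that can be scripted; as a sketch, assuming a default layout for the two config directories (the paths are assumptions, not from this thread):

grep -R "namenode" "$SPARK_HOME/conf" "$HADOOP_CONF_DIR"

An empty result would rule the client-side config files out.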

Sent from Outlook Mobile<https://aka.ms/blhgte>



On Tue, Apr 5, 2016 at 9:22 AM -0700, "Ted Yu" <yu...@gmail.com> wrote:

Raymond:

Did "namenode" appear in any of the Spark config files ?

BTW Scala 2.11 is used by the default build.

On Tue, Apr 5, 2016 at 6:22 AM, Raymond Honderdors <Ra...@sizmek.com> wrote:
I can see that the build is successful
(-Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests clean package)

The documentation page still says:
"
Building With Hive and JDBC Support
To enable Hive integration for Spark SQL along with its JDBC server and CLI, add the -Phive and -Phive-thriftserver profiles to your existing build options. By default Spark will build with Hive 0.13.1 bindings.

# Apache Hadoop 2.4.X with Hive 13 support
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package
Building for Scala 2.11
To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11 property:

./dev/change-scala-version.sh 2.11
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
Spark does not yet support its JDBC component for Scala 2.11.
"
Source: http://spark.apache.org/docs/latest/building-spark.html

When I try to start the thrift server I get the following error:
"
16/04/05 16:09:11 INFO BlockManagerMaster: Registered BlockManager
16/04/05 16:09:12 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode
                at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
                at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:312)
                at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
                at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
                at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:601)
                at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
                at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
                at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
                at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
                at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
                at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
                at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1667)
                at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:67)
                at org.apache.spark.SparkContext.<init>(SparkContext.scala:517)
                at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:57)
                at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:77)
                at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:498)
                at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:726)
                at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
                at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
                at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
                at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.UnknownHostException: namenode
                ... 26 more
16/04/05 16:09:12 INFO SparkUI: Stopped Spark web UI at http://10.10.182.195:4040
16/04/05 16:09:12 INFO SparkDeploySchedulerBackend: Shutting down all executors
"



Raymond Honderdors
Team Lead Analytics BI
Business Intelligence Developer
raymond.honderdors@sizmek.com
T +972.7325.3569
Herzliya

From: Reynold Xin [mailto:rxin@databricks.com]
Sent: Tuesday, April 05, 2016 3:57 PM
To: Raymond Honderdors <Ra...@sizmek.com>
Cc: dev@spark.apache.org
Subject: Re: Build with Thrift Server & Scala 2.11

What do you mean? The Jenkins build for Spark uses 2.11 and also builds the thrift server.

On Tuesday, April 5, 2016, Raymond Honderdors <Ra...@sizmek.com> wrote:
Is anyone looking into this one, Build with Thrift Server & Scala 2.11?
If so, when can we expect it?

Raymond Honderdors
Team Lead Analytics BI
Business Intelligence Developer
raymond.honderdors@sizmek.com
T +972.7325.3569
Herzliya




---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: Build with Thrift Server & Scala 2.11

Posted by Ted Yu <yu...@gmail.com>.
Raymond:

Did "namenode" appear in any of the Spark config files ?

BTW Scala 2.11 is used by the default build.

On Tue, Apr 5, 2016 at 6:22 AM, Raymond Honderdors <
Raymond.Honderdors@sizmek.com> wrote:

> I can see that the build is successful
>
> (-Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver
> -Dscala-2.11 -DskipTests clean package)
>
>
>
> The documentation page still says:
>
> “
>
> Building With Hive and JDBC Support
>
> To enable Hive integration for Spark SQL along with its JDBC server and
> CLI, add the -Phive and -Phive-thriftserver profiles to your existing build
> options. By default Spark will build with Hive 0.13.1 bindings.
>
>
>
> # Apache Hadoop 2.4.X with Hive 13 support
>
> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver
> -DskipTests clean package
>
> Building for Scala 2.11
>
> To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11
> property:
>
>
>
> ./dev/change-scala-version.sh 2.11
>
> mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
>
> Spark does not yet support its JDBC component for Scala 2.11.
>
> ”
>
> Source: http://spark.apache.org/docs/latest/building-spark.html
>
>
>
> When I try to start the thrift server I get the following error:
>
> “
>
> 16/04/05 16:09:11 INFO BlockManagerMaster: Registered BlockManager
>
> 16/04/05 16:09:12 ERROR SparkContext: Error initializing SparkContext.
>
> java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode
>
>                 at
> org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
>
>                 at
> org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:312)
>
>                 at
> org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
>
>                 at
> org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
>
>                 at
> org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:601)
>
>                 at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
>
>                 at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
>
>                 at
> org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
>
>                 at
> org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
>
>                 at
> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
>
>                 at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
>
>                 at
> org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1667)
>
>                 at
> org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:67)
>
>                 at
> org.apache.spark.SparkContext.<init>(SparkContext.scala:517)
>
>                 at
> org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:57)
>
>                 at
> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:77)
>
>                 at
> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
>
>                 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>
>                 at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>
>                 at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>
>                 at java.lang.reflect.Method.invoke(Method.java:498)
>
>                 at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:726)
>
>                 at
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
>
>                 at
> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
>
>                 at
> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
>
>                 at
> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> Caused by: java.net.UnknownHostException: namenode
>
>                 ... 26 more
>
> 16/04/05 16:09:12 INFO SparkUI: Stopped Spark web UI at
> http://10.10.182.195:4040
>
> 16/04/05 16:09:12 INFO SparkDeploySchedulerBackend: Shutting down all
> executors
>
> ”
>
>
>
>
>
>
>
> *Raymond Honderdors *
>
> *Team Lead Analytics BI*
>
> *Business Intelligence Developer *
>
> *raymond.honderdors@sizmek.com *
>
> *T +972.7325.3569*
>
> *Herzliya*
>
>
>
> *From:* Reynold Xin [mailto:rxin@databricks.com]
> *Sent:* Tuesday, April 05, 2016 3:57 PM
> *To:* Raymond Honderdors <Ra...@sizmek.com>
> *Cc:* dev@spark.apache.org
> *Subject:* Re: Build with Thrift Server & Scala 2.11
>
>
>
> What do you mean? The Jenkins build for Spark uses 2.11 and also builds
> the thrift server.
>
> On Tuesday, April 5, 2016, Raymond Honderdors <
> Raymond.Honderdors@sizmek.com> wrote:
>
> Is anyone looking into this one, Build with Thrift Server & Scala 2.11?
>
> If so, when can we expect it?
>
>
>
> *Raymond Honderdors *
>
> *Team Lead Analytics BI*
>
> *Business Intelligence Developer *
>
> *raymond.honderdors@sizmek.com *
>
> *T +972.7325.3569*
>
> *Herzliya*
>
>
>
>
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>

RE: Build with Thrift Server & Scala 2.11

Posted by Raymond Honderdors <Ra...@sizmek.com>.
I can see that the build is successful
(-Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests clean package)

The documentation page still says:
“
Building With Hive and JDBC Support
To enable Hive integration for Spark SQL along with its JDBC server and CLI, add the -Phive and -Phive-thriftserver profiles to your existing build options. By default Spark will build with Hive 0.13.1 bindings.

# Apache Hadoop 2.4.X with Hive 13 support
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package
Building for Scala 2.11
To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11 property:

./dev/change-scala-version.sh 2.11
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
Spark does not yet support its JDBC component for Scala 2.11.
”
Source: http://spark.apache.org/docs/latest/building-spark.html

When I try to start the thrift server I get the following error:
“
16/04/05 16:09:11 INFO BlockManagerMaster: Registered BlockManager
16/04/05 16:09:12 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode
                at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
                at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:312)
                at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
                at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
                at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:601)
                at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
                at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
                at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
                at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
                at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
                at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
                at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1667)
                at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:67)
                at org.apache.spark.SparkContext.<init>(SparkContext.scala:517)
                at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:57)
                at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:77)
                at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:498)
                at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:726)
                at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
                at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
                at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
                at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.UnknownHostException: namenode
                ... 26 more
16/04/05 16:09:12 INFO SparkUI: Stopped Spark web UI at http://10.10.182.195:4040
16/04/05 16:09:12 INFO SparkDeploySchedulerBackend: Shutting down all executors
”



Raymond Honderdors
Team Lead Analytics BI
Business Intelligence Developer
raymond.honderdors@sizmek.com
T +972.7325.3569
Herzliya

From: Reynold Xin [mailto:rxin@databricks.com]
Sent: Tuesday, April 05, 2016 3:57 PM
To: Raymond Honderdors <Ra...@sizmek.com>
Cc: dev@spark.apache.org
Subject: Re: Build with Thrift Server & Scala 2.11

What do you mean? The Jenkins build for Spark uses 2.11 and also builds the thrift server.

On Tuesday, April 5, 2016, Raymond Honderdors <Ra...@sizmek.com> wrote:
Is anyone looking into this one, Build with Thrift Server & Scala 2.11?
If so, when can we expect it?

Raymond Honderdors
Team Lead Analytics BI
Business Intelligence Developer
raymond.honderdors@sizmek.com
T +972.7325.3569
Herzliya



Re: Build with Thrift Server & Scala 2.11

Posted by Reynold Xin <rx...@databricks.com>.
What do you mean? The Jenkins build for Spark uses 2.11 and also builds the
thrift server.

On Tuesday, April 5, 2016, Raymond Honderdors <Ra...@sizmek.com>
wrote:

> Is anyone looking into this one, Build with Thrift Server & Scala 2.11?
>
> If so, when can we expect it?
>
>
>
> *Raymond Honderdors *
>
> *Team Lead Analytics BI*
>
> *Business Intelligence Developer *
>
> *raymond.honderdors@sizmek.com *
>
> *T +972.7325.3569*
>
> *Herzliya*
>
>
>