Posted to users@zeppelin.apache.org by Ruslan Dautkhanov <da...@gmail.com> on 2016/11/26 16:43:21 UTC
SparkInterpreter.java[getSQLContext_1]:256 - Can't create HiveContext. Fallback to SQLContext
Getting
SparkInterpreter.java[getSQLContext_1]:256 - Can't create HiveContext. Fallback to SQLContext
See full stack [1] below.
Java 7
Zeppelin 0.6.2
CDH 5.8.3
Spark 1.6
How to fix this?
Thank you.
[1]
WARN [2016-11-26 09:38:39,028] ({pool-2-thread-2} SparkInterpreter.java[getSQLContext_1]:256) - Can't create HiveContext. Fallback to SQLContext
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.zeppelin.spark.SparkInterpreter.getSQLContext_1(SparkInterpreter.java:251)
    at org.apache.zeppelin.spark.SparkInterpreter.getSQLContext(SparkInterpreter.java:228)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:829)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
    at org.apache.zeppelin.spark.PySparkInterpreter.getSparkInterpreter(PySparkInterpreter.java:512)
    at org.apache.zeppelin.spark.PySparkInterpreter.createGatewayServerAndStartScript(PySparkInterpreter.java:181)
    at org.apache.zeppelin.spark.PySparkInterpreter.open(PySparkInterpreter.java:159)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: com.facebook.fb303.FacebookService$Client.sendBaseOneway(Ljava/lang/String;Lorg/apache/thrift/TBase;)V
    at com.facebook.fb303.FacebookService$Client.send_shutdown(FacebookService.java:436)
    at com.facebook.fb303.FacebookService$Client.shutdown(FacebookService.java:430)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:503)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:105)
    at com.sun.proxy.$Proxy23.close(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2016)
    at com.sun.proxy.$Proxy23.close(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.close(Hive.java:332)
    at org.apache.hadoop.hive.ql.metadata.Hive.access$000(Hive.java:140)
    at org.apache.hadoop.hive.ql.metadata.Hive$1.remove(Hive.java:160)
    at org.apache.hadoop.hive.ql.metadata.Hive.closeCurrent(Hive.java:301)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:271)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:248)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:220)
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:210)
    at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:464)
    at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:463)
    at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40)
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
    ... 23 more
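[Editor's note] The `NoSuchMethodError` in the `Caused by` section typically means two different versions of the fb303/libthrift classes are competing on the interpreter's classpath. A minimal diagnostic sketch (the `WhichJar` helper is made up for illustration; only the class names come from the stack trace above) that reports which jar a given class was actually loaded from:

```java
// WhichJar: hypothetical helper, not part of Zeppelin or Spark.
// Reports the jar (or JDK location) each named class was loaded from,
// which reveals competing copies of com.facebook.fb303.FacebookService$Client.
import java.security.CodeSource;

public class WhichJar {
    public static String locate(String className) throws ClassNotFoundException {
        Class<?> c = Class.forName(className);
        CodeSource src = c.getProtectionDomain().getCodeSource();
        // Classes loaded by the bootstrap loader (core JDK) have no CodeSource.
        return src == null ? "bootstrap classloader (JDK)"
                           : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // e.g. java WhichJar 'com.facebook.fb303.FacebookService$Client'
        for (String name : args) {
            System.out.println(name + " -> " + locate(name));
        }
    }
}
```

Run inside the same JVM/classpath as the failing Spark interpreter (for example from a notebook paragraph, assuming the paragraph can reach these classes); the printed location shows whether the fb303 client comes from Zeppelin's bundled jars or from the CDH parcel.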
Re: SparkInterpreter.java[getSQLContext_1]:256 - Can't create HiveContext. Fallback to SQLContext
Posted by Ruslan Dautkhanov <da...@gmail.com>.
CDH 5.8 has Hive 1.1 + a bunch of patches Cloudera has backported.
--
Ruslan Dautkhanov
On Sun, Nov 27, 2016 at 5:30 AM, Jianfeng (Jeff) Zhang <jzhang@hortonworks.com> wrote:
>
> This looks like a jar version mismatch. Which Hive version are you using?
>
>
> Best Regard,
> Jeff Zhang
>
Re: SparkInterpreter.java[getSQLContext_1]:256 - Can't create HiveContext. Fallback to SQLContext
Posted by "Jianfeng (Jeff) Zhang" <jz...@hortonworks.com>.
This looks like a jar version mismatch. Which Hive version are you using?
Best Regard,
Jeff Zhang
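[Editor's note] The mismatch Jeff suspects can be confirmed directly: a `NoSuchMethodError` means the caller was compiled against a class version that declared the method, but the class actually loaded at runtime does not provide it. A hedged sketch (the `HasMethod` helper name is invented for illustration) that probes for a method at runtime via reflection:

```java
// HasMethod: illustrative helper, not from this thread. Uses reflection to
// check whether the loaded class actually declares a given public method;
// 'false' for a method the caller expects is exactly the NoSuchMethodError case.
import java.lang.reflect.Method;

public class HasMethod {
    public static boolean has(Class<?> cls, String name, Class<?>... paramTypes) {
        try {
            Method m = cls.getMethod(name, paramTypes);
            return m != null;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) throws Exception {
        // In the failing setup one would first load the real class, e.g.
        //   Class<?> client = Class.forName("com.facebook.fb303.FacebookService$Client");
        // and probe for sendBaseOneway(String, org.apache.thrift.TBase).
        // Demonstrated on a JDK class here so the sketch is self-contained:
        System.out.println(has(String.class, "substring", int.class));       // a method that exists
        System.out.println(has(String.class, "sendBaseOneway", String.class)); // one that does not
    }
}
```

If the probe returns false against the fb303 jar that actually wins on the classpath, that jar predates the thrift API Spark's bundled Hive client was compiled against.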