Posted to user@hive.apache.org by Amith sha <am...@gmail.com> on 2015/03/13 12:51:18 UTC

Hive on Spark

Hi all,


Recently I configured Spark 1.2.0; my environment is Hadoop 2.6.0
with Hive 1.1.0. I have tried Hive on Spark, and while executing an
INSERT INTO statement I get the following error.

Query ID = hadoop2_20150313162828_8764adad-a8e4-49da-9ef5-35e4ebd6bc63
Total jobs = 1
Launching Job 1 out of 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Failed to execute spark task, with exception
'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create
spark client.)'
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.spark.SparkTask



I have added the spark-assembly jar to the Hive lib directory,
and also added it in the Hive console with the add jar command, followed by these steps:

set spark.home=/opt/spark-1.2.1/;


add jar /opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar;



set hive.execution.engine=spark;


set spark.master=spark://xxxxxxx:7077;


set spark.eventLog.enabled=true;


set spark.executor.memory=512m;


set spark.serializer=org.apache.spark.serializer.KryoSerializer;
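
As a quick sanity check, the same Hive CLI session can echo these values
back before the insert is run, since "set <property>;" with no value
prints the current setting. A minimal sketch using the property names
set above:

-- confirm the session actually picked up the values set above
set hive.execution.engine;
set spark.master;
set spark.home;
set spark.executor.memory;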

Can anyone suggest what might be wrong?



Thanks & Regards
Amithsha

Re: Hive on Spark

Posted by Amith sha <am...@gmail.com>.
Hi Xuefu,

I am running Spark in local mode, and my Spark master log is:

15/03/20 10:52:52 INFO AbstractConnector: Started
SelectChannelConnector@nn01:6066
15/03/20 10:52:52 INFO Utils: Successfully started service on port 6066.
15/03/20 10:52:52 INFO StandaloneRestServer: Started REST server for
submitting applications on port 6066
15/03/20 10:52:52 INFO Master: Starting Spark master at spark://nn01:7077
15/03/20 10:52:52 INFO Master: Running Spark version 1.3.0
15/03/20 10:52:52 INFO Server: jetty-8.y.z-SNAPSHOT
15/03/20 10:52:52 INFO AbstractConnector: Started
SelectChannelConnector@0.0.0.0:8080
15/03/20 10:52:52 INFO Utils: Successfully started service 'MasterUI'
on port 8080.
15/03/20 10:52:52 INFO MasterWebUI: Started MasterWebUI at http://nn01:8080
15/03/20 10:52:53 INFO Master: I have been elected leader! New state: ALIVE


And my Hive log is:

2015-03-20 10:53:12,871 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2015-03-20 10:53:12,871 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
2015-03-20 10:53:12,871 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2015-03-20 10:53:12,872 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
java.lang.reflect.Method.invoke(Method.java:606)
2015-03-20 10:53:12,872 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
2015-03-20 10:53:12,872 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
2015-03-20 10:53:12,872 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
2015-03-20 10:53:12,873 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
2015-03-20 10:53:12,873 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2015-03-20 10:53:13,412 WARN  [Driver]: client.SparkClientImpl
(SparkClientImpl.java:run(388)) - Child process exited with code 1.
2015-03-20 10:53:28,202 WARN  [Thread-42]: client.SparkClientImpl
(SparkClientImpl.java:<init>(96)) - Error while waiting for client to
connect.
java.util.concurrent.ExecutionException:
java.util.concurrent.TimeoutException: Timed out waiting for client
connection.
    at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
    at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
    at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
    at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
    at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
    at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
    at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
Caused by: java.util.concurrent.TimeoutException: Timed out waiting
for client connection.
    at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
    at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
    at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    at java.lang.Thread.run(Thread.java:744)
2015-03-20 10:53:28,207 ERROR [Thread-42]: exec.Task
(SessionState.java:printError(861)) - Failed to execute spark task,
with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed
to create spark client.)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
    at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
    at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
Caused by: java.lang.RuntimeException:
java.util.concurrent.ExecutionException:
java.util.concurrent.TimeoutException: Timed out waiting for client
connection.
    at com.google.common.base.Throwables.propagate(Throwables.java:156)
    at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
    at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
    at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
    at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
    ... 6 more
Caused by: java.util.concurrent.ExecutionException:
java.util.concurrent.TimeoutException: Timed out waiting for client
connection.
    at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
    at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
    ... 10 more
Caused by: java.util.concurrent.TimeoutException: Timed out waiting
for client connection.
    at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
    at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
    at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    at java.lang.Thread.run(Thread.java:744)

2015-03-20 10:53:28,207 ERROR [Thread-42]: exec.Task
(SparkTask.java:execute(124)) - Failed to execute spark task, with
exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to
create spark client.)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
    at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
    at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
Caused by: java.lang.RuntimeException:
java.util.concurrent.ExecutionException:
java.util.concurrent.TimeoutException: Timed out waiting for client
connection.
    at com.google.common.base.Throwables.propagate(Throwables.java:156)
    at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
    at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
    at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
    at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
    ... 6 more
Caused by: java.util.concurrent.ExecutionException:
java.util.concurrent.TimeoutException: Timed out waiting for client
connection.
    at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
    at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
    ... 10 more
Caused by: java.util.concurrent.TimeoutException: Timed out waiting
for client connection.
    at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
    at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
    at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    at java.lang.Thread.run(Thread.java:744)
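
The timeout hit above is governed by hive.spark.client.server.connect.timeout,
which the HiveSparkClientFactory lines quoted further down show at its
90000 ms default, alongside hive.spark.client.connect.timeout at 1000 ms.
As a diagnostic sketch only, both can be raised per session from the Hive
CLI; the values below are illustrative, and raising them does not address
whatever makes the child process exit with code 1:

-- illustrative values only, in milliseconds
set hive.spark.client.server.connect.timeout=180000;
set hive.spark.client.connect.timeout=10000;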









Thanks & Regards
Amithsha


On Mon, Mar 16, 2015 at 9:32 PM, Jimmy Xiang <jx...@cloudera.com> wrote:
> One more thing, "java.lang.NoSuchFieldError:
> SPARK_RPC_CLIENT_CONNECT_TIMEOUT", are your jar files consistent?
>
> On Mon, Mar 16, 2015 at 6:47 AM, Xuefu Zhang <xz...@cloudera.com> wrote:
>>
>> It seems that your remote driver failed to start. I suggest #1: try
>> spark.master=local first; #2: check spark.log to find out why the remote
>> driver fails.
>>
>> --Xuefu
>>
>> On Sun, Mar 15, 2015 at 10:17 PM, Amith sha <am...@gmail.com> wrote:
>>>
>>> Hi,
>>>
>>> I have already added the spark-assembly jar in hive lib & here is my hive
>>> log
>>>
>>>
>>> 2015-03-16 10:40:08,299 INFO  [main]: SessionState
>>> (SessionState.java:printInfo(852)) - Added
>>>
>>> [/opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar]
>>> to class path
>>> 2015-03-16 10:40:08,300 INFO  [main]: SessionState
>>> (SessionState.java:printInfo(852)) - Added resources:
>>>
>>> [/opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar]
>>> 2015-03-16 10:40:36,914 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.run
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:36,915 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=TimeToSubmit
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:36,915 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=compile
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:36,916 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=parse
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:36,916 INFO  [main]: parse.ParseDriver
>>> (ParseDriver.java:parse(185)) - Parsing command: insert into table
>>> test values(5,8900)
>>> 2015-03-16 10:40:36,917 INFO  [main]: parse.ParseDriver
>>> (ParseDriver.java:parse(206)) - Parse Completed
>>> 2015-03-16 10:40:36,925 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=parse
>>> start=1426482636916 end=1426482636925 duration=9
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:36,929 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=semanticAnalyze
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:37,034 INFO  [main]: parse.CalcitePlanner
>>> (SemanticAnalyzer.java:analyzeInternal(10146)) - Starting Semantic
>>> Analysis
>>> 2015-03-16 10:40:37,212 INFO  [main]: parse.CalcitePlanner
>>> (SemanticAnalyzer.java:genResolvedParseTree(10129)) - Completed phase
>>> 1 of Semantic Analysis
>>> 2015-03-16 10:40:37,212 INFO  [main]: parse.CalcitePlanner
>>> (SemanticAnalyzer.java:getMetaData(1434)) - Get metadata for source
>>> tables
>>> 2015-03-16 10:40:37,213 INFO  [main]: parse.CalcitePlanner
>>> (SemanticAnalyzer.java:getMetaData(1582)) - Get metadata for
>>> subqueries
>>> 2015-03-16 10:40:37,213 INFO  [main]: parse.CalcitePlanner
>>> (SemanticAnalyzer.java:getMetaData(1606)) - Get metadata for
>>> destination tables
>>> 2015-03-16 10:40:37,214 INFO  [pool-3-thread-2]:
>>> metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 2:
>>> source:10.10.10.25 get_table : db=test tbl=test
>>> 2015-03-16 10:40:37,214 INFO  [pool-3-thread-2]: HiveMetaStore.audit
>>> (HiveMetaStore.java:logAuditEvent(358)) - ugi=hadoop2
>>> ip=10.10.10.25    cmd=source:10.10.10.25 get_table : db=test tbl=test
>>> 2015-03-16 10:40:37,316 INFO  [main]: parse.CalcitePlanner
>>> (SemanticAnalyzer.java:genResolvedParseTree(10133)) - Completed
>>> getting MetaData in Semantic Analysis
>>> 2015-03-16 10:40:37,318 INFO  [main]: parse.BaseSemanticAnalyzer
>>> (CalcitePlanner.java:canHandleAstForCbo(349)) - Not invoking CBO
>>> because the statement has too few joins
>>> 2015-03-16 10:40:37,320 INFO  [main]: common.FileUtils
>>> (FileUtils.java:mkdir(501)) - Creating directory if it doesn't exist:
>>>
>>> hdfs://nn01:9000/user/hive/warehouse/test.db/test/.hive-staging_hive_2015-03-16_10-40-36_915_4571608652542611567-1
>>> 2015-03-16 10:40:37,429 INFO  [main]: parse.CalcitePlanner
>>> (SemanticAnalyzer.java:genFileSinkPlan(6474)) - Set stats collection
>>> dir :
>>> hdfs://nn01:9000/user/hive/warehouse/test.db/test/.hive-staging_hive_2015-03-16_10-40-36_915_4571608652542611567-1/-ext-10001
>>> 2015-03-16 10:40:37,450 INFO  [main]: ppd.OpProcFactory
>>> (OpProcFactory.java:process(657)) - Processing for FS(3)
>>> 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
>>> (OpProcFactory.java:process(657)) - Processing for SEL(2)
>>> 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
>>> (OpProcFactory.java:process(657)) - Processing for SEL(1)
>>> 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
>>> (OpProcFactory.java:process(384)) - Processing for TS(0)
>>> 2015-03-16 10:40:37,507 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>>> method=partition-retrieving
>>> from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
>>> 2015-03-16 10:40:37,510 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
>>> method=partition-retrieving start=1426482637507 end=1426482637510
>>> duration=3 from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
>>> 2015-03-16 10:40:37,583 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>>> method=SparkOptimizeOperatorTree
>>> from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>>> 2015-03-16 10:40:37,638 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
>>> method=SparkOptimizeOperatorTree start=1426482637583 end=1426482637638
>>> duration=55 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>>> 2015-03-16 10:40:37,660 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>>> method=SparkGenerateTaskTree
>>> from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>>> 2015-03-16 10:40:37,711 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
>>> method=SparkGenerateTaskTree start=1426482637640 end=1426482637711
>>> duration=71 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>>> 2015-03-16 10:40:37,715 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>>> method=SparkOptimizeTaskTree
>>> from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>>> 2015-03-16 10:40:37,737 INFO  [main]: physical.NullScanTaskDispatcher
>>> (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
>>> where optimization is applicable
>>> 2015-03-16 10:40:37,741 INFO  [main]: physical.NullScanTaskDispatcher
>>> (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
>>> 2015-03-16 10:40:37,742 INFO  [main]: physical.NullScanTaskDispatcher
>>> (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
>>> where optimization is applicable
>>> 2015-03-16 10:40:37,744 INFO  [main]: physical.NullScanTaskDispatcher
>>> (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
>>> 2015-03-16 10:40:37,747 INFO  [main]: physical.NullScanTaskDispatcher
>>> (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
>>> where optimization is applicable
>>> 2015-03-16 10:40:37,754 INFO  [main]: physical.NullScanTaskDispatcher
>>> (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
>>> 2015-03-16 10:40:37,756 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
>>> method=SparkOptimizeTaskTree start=1426482637715 end=1426482637756
>>> duration=41 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>>> 2015-03-16 10:40:37,762 INFO  [main]: parse.CalcitePlanner
>>> (SemanticAnalyzer.java:analyzeInternal(10231)) - Completed plan
>>> generation
>>> 2015-03-16 10:40:37,762 INFO  [main]: ql.Driver
>>> (Driver.java:compile(433)) - Semantic Analysis Completed
>>> 2015-03-16 10:40:37,762 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=semanticAnalyze
>>> start=1426482636929 end=1426482637762 duration=833
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:37,763 INFO  [main]: ql.Driver
>>> (Driver.java:getSchema(239)) - Returning Hive schema:
>>> Schema(fieldSchemas:[FieldSchema(name:_col0, type:int, comment:null),
>>> FieldSchema(name:_col1, type:int, comment:null)], properties:null)
>>> 2015-03-16 10:40:37,765 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=compile
>>> start=1426482636915 end=1426482637765 duration=850
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:37,765 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>>> method=acquireReadWriteLocks from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:38,193 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
>>> method=acquireReadWriteLocks start=1426482637765 end=1426482638193
>>> duration=428 from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:38,193 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.execute
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:38,193 INFO  [main]: ql.Driver
>>> (Driver.java:execute(1317)) - Starting command: insert into table test
>>> values(5,8900)
>>> 2015-03-16 10:40:38,194 INFO  [main]: ql.Driver
>>> (SessionState.java:printInfo(852)) - Query ID =
>>> hadoop2_20150316104040_c19975af-9dc4-4af2-bfab-d44619224679
>>> 2015-03-16 10:40:38,194 INFO  [main]: ql.Driver
>>> (SessionState.java:printInfo(852)) - Total jobs = 1
>>> 2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=TimeToSubmit
>>> start=1426482636914 end=1426482638194 duration=1280
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=runTasks
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>>> method=task.SPARK.Stage-1 from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:40:38,195 INFO  [main]: ql.Driver
>>> (SessionState.java:printInfo(852)) - Launching Job 1 out of 1
>>> 2015-03-16 10:40:38,195 INFO  [main]: ql.Driver
>>> (Driver.java:launchTask(1630)) - Starting task [Stage-1:MAPRED] in
>>> parallel
>>> 2015-03-16 10:40:38,200 INFO  [Thread-49]: hive.metastore
>>> (HiveMetaStoreClient.java:open(365)) - Trying to connect to metastore
>>> with URI thrift://nn01:7099
>>> 2015-03-16 10:40:38,208 INFO  [Thread-49]: hive.metastore
>>> (HiveMetaStoreClient.java:open(461)) - Connected to metastore.
>>> 2015-03-16 10:40:38,233 INFO  [Thread-49]: session.SessionState
>>> (SessionState.java:start(488)) - No Tez session required at this
>>> point. hive.execution.engine=mr.
>>> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
>>> (SessionState.java:printInfo(852)) - In order to change the average
>>> load for a reducer (in bytes):
>>> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
>>> (SessionState.java:printInfo(852)) -   set
>>> hive.exec.reducers.bytes.per.reducer=<number>
>>> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
>>> (SessionState.java:printInfo(852)) - In order to limit the maximum
>>> number of reducers:
>>> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
>>> (SessionState.java:printInfo(852)) -   set
>>> hive.exec.reducers.max=<number>
>>> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
>>> (SessionState.java:printInfo(852)) - In order to set a constant number
>>> of reducers:
>>> 2015-03-16 10:40:38,235 INFO  [Thread-49]: exec.Task
>>> (SessionState.java:printInfo(852)) -   set
>>> mapreduce.job.reduces=<number>
>>> 2015-03-16 10:40:38,243 INFO  [Thread-49]:
>>> session.SparkSessionManagerImpl
>>> (SparkSessionManagerImpl.java:setup(82)) - Setting up the session
>>> manager.
>>> 2015-03-16 10:40:38,502 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>>> property from hive configuration (hive.spark.client.connect.timeout ->
>>> 1000).
>>> 2015-03-16 10:40:38,504 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>>> property from hive configuration (spark.serializer ->
>>> org.apache.spark.serializer.KryoSerializer).
>>> 2015-03-16 10:40:38,505 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>>> property from hive configuration (spark.eventLog.enabled -> true).
>>> 2015-03-16 10:40:38,508 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>>> property from hive configuration (hive.spark.client.rpc.threads -> 8).
>>> 2015-03-16 10:40:38,508 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>>> property from hive configuration (hive.spark.client.secret.bits ->
>>> 256).
>>> 2015-03-16 10:40:38,508 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>>> property from hive configuration (spark.home -> /opt/spark-1.2.1/).
>>> 2015-03-16 10:40:38,509 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>>> property from hive configuration (hive.spark.client.rpc.max.size ->
>>> 52428800).
>>> 2015-03-16 10:40:38,509 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>>> property from hive configuration (spark.master ->
>>> spark://10.10.10.25:7077).
>>> 2015-03-16 10:40:38,509 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>>> property from hive configuration (spark.executor.memory -> 512m).
>>> 2015-03-16 10:40:38,510 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>>> property from hive configuration
>>> (hive.spark.client.server.connect.timeout -> 90000).
>>> 2015-03-16 10:40:39,100 WARN  [Thread-49]: rpc.RpcConfiguration
>>> (RpcConfiguration.java:getServerAddress(123)) - Your hostname, nn01,
>>> resolves to a loopback address, but we couldn't find  any external IP
>>> address!
>>> 2015-03-16 10:40:39,101 WARN  [Thread-49]: rpc.RpcConfiguration
>>> (RpcConfiguration.java:getServerAddress(125)) - Set
>>> hive.spark.client.server.address if you need to bind to another
>>> address.
>>> 2015-03-16 10:40:39,113 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>>> property from hive configuration (hive.spark.client.connect.timeout ->
>>> 1000).
>>> 2015-03-16 10:40:39,115 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>>> property from hive configuration (spark.serializer ->
>>> org.apache.spark.serializer.KryoSerializer).
>>> 2015-03-16 10:40:39,116 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>>> property from hive configuration (spark.eventLog.enabled -> true).
>>> 2015-03-16 10:40:39,116 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>>> property from hive configuration (hive.spark.client.rpc.threads -> 8).
>>> 2015-03-16 10:40:39,117 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>>> property from hive configuration (hive.spark.client.secret.bits ->
>>> 256).
>>> 2015-03-16 10:40:39,118 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>>> property from hive configuration (spark.home -> /opt/spark-1.2.1/).
>>> 2015-03-16 10:40:39,119 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>>> property from hive configuration (hive.spark.client.rpc.max.size ->
>>> 52428800).
>>> 2015-03-16 10:40:39,120 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>>> property from hive configuration (spark.master ->
>>> spark://10.10.10.25:7077).
>>> 2015-03-16 10:40:39,121 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>>> property from hive configuration (spark.executor.memory -> 512m).
>>> 2015-03-16 10:40:39,122 INFO  [Thread-49]:
>>> spark.HiveSparkClientFactory
>>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>>> property from hive configuration
>>> (hive.spark.client.server.connect.timeout -> 90000).
>>> 2015-03-16 10:40:43,081 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) - Spark assembly has been built with
>>> Hive, including Datanucleus jars on classpath
>>> 2015-03-16 10:40:47,867 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
>>> property: hive.spark.client.connect.timeout=1000
>>> 2015-03-16 10:40:47,869 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
>>> property: hive.spark.client.rpc.threads=8
>>> 2015-03-16 10:40:47,869 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
>>> property: hive.spark.client.rpc.max.size=52428800
>>> 2015-03-16 10:40:47,870 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
>>> property: hive.spark.client.secret.bits=256
>>> 2015-03-16 10:40:47,872 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
>>> property: hive.spark.client.server.connect.timeout=90000
>>> 2015-03-16 10:40:48,329 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) - 15/03/16 10:40:48 INFO
>>> client.RemoteDriver: Connecting to: nn01:53098
>>> 2015-03-16 10:40:48,379 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) - Exception in thread "main"
>>> java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
>>> 2015-03-16 10:40:48,379 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) -     at
>>>
>>> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)
>>> 2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) -     at
>>> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:137)
>>> 2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) -     at
>>> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:528)
>>> 2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) -     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> 2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) -     at
>>>
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> 2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) -     at
>>>
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> 2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) -     at
>>> java.lang.reflect.Method.invoke(Method.java:606)
>>> 2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) -     at
>>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>> 2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) -     at
>>> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>> 2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(530)) -     at
>>> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> 2015-03-16 10:40:49,055 WARN  [Driver]: client.SparkClientImpl
>>> (SparkClientImpl.java:run(388)) - Child process exited with code 1.
>>> 2015-03-16 10:42:10,399 WARN  [Thread-49]: client.SparkClientImpl
>>> (SparkClientImpl.java:<init>(96)) - Error while waiting for client to
>>> connect.
>>> java.util.concurrent.ExecutionException:
>>> java.util.concurrent.TimeoutException: Timed out waiting for client
>>> connection.
>>>     at
>>> io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>>>     at
>>> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
>>>     at
>>> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
>>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
>>> Caused by: java.util.concurrent.TimeoutException: Timed out waiting
>>> for client connection.
>>>     at
>>> org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
>>>     at
>>> io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
>>>     at
>>> io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
>>>     at
>>> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
>>>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>>     at
>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
>>>     at java.lang.Thread.run(Thread.java:744)
>>> 2015-03-16 10:42:10,413 ERROR [Thread-49]: exec.Task
>>> (SessionState.java:printError(861)) - Failed to execute spark task,
>>> with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed
>>> to create spark client.)'
>>> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark
>>> client.
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
>>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
>>> Caused by: java.lang.RuntimeException:
>>> java.util.concurrent.ExecutionException:
>>> java.util.concurrent.TimeoutException: Timed out waiting for client
>>> connection.
>>>     at com.google.common.base.Throwables.propagate(Throwables.java:156)
>>>     at
>>> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
>>>     at
>>> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
>>>     ... 6 more
>>> Caused by: java.util.concurrent.ExecutionException:
>>> java.util.concurrent.TimeoutException: Timed out waiting for client
>>> connection.
>>>     at
>>> io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>>>     at
>>> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
>>>     ... 10 more
>>> Caused by: java.util.concurrent.TimeoutException: Timed out waiting
>>> for client connection.
>>>     at
>>> org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
>>>     at
>>> io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
>>>     at
>>> io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
>>>     at
>>> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
>>>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>>     at
>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
>>>     at java.lang.Thread.run(Thread.java:744)
>>>
>>> 2015-03-16 10:42:10,413 ERROR [Thread-49]: exec.Task
>>> (SparkTask.java:execute(124)) - Failed to execute spark task, with
>>> exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to
>>> create spark client.)'
>>> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark
>>> client.
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
>>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
>>> Caused by: java.lang.RuntimeException:
>>> java.util.concurrent.ExecutionException:
>>> java.util.concurrent.TimeoutException: Timed out waiting for client
>>> connection.
>>>     at com.google.common.base.Throwables.propagate(Throwables.java:156)
>>>     at
>>> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
>>>     at
>>> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
>>>     ... 6 more
>>> Caused by: java.util.concurrent.ExecutionException:
>>> java.util.concurrent.TimeoutException: Timed out waiting for client
>>> connection.
>>>     at
>>> io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>>>     at
>>> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
>>>     ... 10 more
>>> Caused by: java.util.concurrent.TimeoutException: Timed out waiting
>>> for client connection.
>>>     at
>>> org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
>>>     at
>>> io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
>>>     at
>>> io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
>>>     at
>>> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
>>>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>>     at
>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
>>>     at java.lang.Thread.run(Thread.java:744)
>>> 2015-03-16 10:42:12,204 ERROR [main]: ql.Driver
>>> (SessionState.java:printError(861)) - FAILED: Execution Error, return
>>> code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
>>> 2015-03-16 10:42:12,205 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=Driver.execute
>>> start=1426482638193 end=1426482732205 duration=94012
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:42:12,205 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:42:12,544 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks
>>> start=1426482732205 end=1426482732544 duration=339
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:42:12,583 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:42:12,583 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks
>>> start=1426482732583 end=1426482732583 duration=0
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.run
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=TimeToSubmit
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=compile
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:44:30,940 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=parse
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 2015-03-16 10:44:30,941 INFO  [main]: parse.ParseDriver
>>> (ParseDriver.java:parse(185)) - Parsing command: insert into table
>>> test values(5,8900)
>>> 2015-03-16 10:44:30,942 INFO  [main]: parse.ParseDriver
>>> (ParseDriver.java:parse(206)) - Parse Completed
>>> 2015-03-16 10:44:30,942 INFO  [main]: log.PerfLogger
>>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=parse
>>> start=1426482870940 end=1426482870942 duration=2
>>> from=org.apache.hadoop.hive.ql.Driver>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> Thanks & Regards
>>> Amithsha
>>>
>>>
>>> On Fri, Mar 13, 2015 at 7:36 PM, Xuefu Zhang <xz...@cloudera.com> wrote:
>>> > You need to copy the spark-assembly.jar to your hive/lib.
>>> >
>>> > Also, you can check hive.log to get more messages.
>>> >
>>> > On Fri, Mar 13, 2015 at 4:51 AM, Amith sha <am...@gmail.com>
>>> > wrote:
>>> >>
>>> >> Hi all,
>>> >>
>>> >>
>>> >> Recently i have configured Spark 1.2.0 and my environment is hadoop
>>> >> 2.6.0 hive 1.1.0 Here i have tried hive on Spark while executing
>>> >> insert into i am getting the following g error.
>>> >>
>>> >> Query ID = hadoop2_20150313162828_8764adad-a8e4-49da-9ef5-35e4ebd6bc63
>>> >> Total jobs = 1
>>> >> Launching Job 1 out of 1
>>> >> In order to change the average load for a reducer (in bytes):
>>> >>   set hive.exec.reducers.bytes.per.reducer=<number>
>>> >> In order to limit the maximum number of reducers:
>>> >>   set hive.exec.reducers.max=<number>
>>> >> In order to set a constant number of reducers:
>>> >>   set mapreduce.job.reduces=<number>
>>> >> Failed to execute spark task, with exception
>>> >> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create
>>> >> spark client.)'
>>> >> FAILED: Execution Error, return code 1 from
>>> >> org.apache.hadoop.hive.ql.exec.spark.SparkTask
>>> >>
>>> >>
>>> >>
>>> >> Have added the spark-assembly jar in hive lib
>>> >> And also in hive console using the command add jar followed by the
>>> >> steps
>>> >>
>>> >> set spark.home=/opt/spark-1.2.1/;
>>> >>
>>> >>
>>> >> add jar
>>> >>
>>> >> /opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar;
>>> >>
>>> >>
>>> >>
>>> >> set hive.execution.engine=spark;
>>> >>
>>> >>
>>> >> set spark.master=spark://xxxxxxx:7077;
>>> >>
>>> >>
>>> >> set spark.eventLog.enabled=true;
>>> >>
>>> >>
>>> >> set spark.executor.memory=512m;
>>> >>
>>> >>
>>> >> set spark.serializer=org.apache.spark.serializer.KryoSerializer;
>>> >>
>>> >> Can anyone suggest!!!!
>>> >>
>>> >>
>>> >>
>>> >> Thanks & Regards
>>> >> Amithsha
>>> >
>>> >
>>
>>
>

Re: Hive on Spark

Posted by Jimmy Xiang <jx...@cloudera.com>.
One more thing: given the "java.lang.NoSuchFieldError:
SPARK_RPC_CLIENT_CONNECT_TIMEOUT", are your jar files consistent?
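
A rough way to check is to compare the spark-related jars Hive loads with
the assembly your Spark build produced, and make sure they come from
matching builds. A minimal shell sketch, with paths taken from earlier in
the thread and HIVE_HOME standing in for the Hive installation directory:

# spark-related jars visible to Hive (HIVE_HOME is a placeholder)
ls $HIVE_HOME/lib | grep -i spark
# the assembly produced by the Spark 1.2.1 build, as used in the add jar command
ls /opt/spark-1.2.1/assembly/target/scala-2.10/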

On Mon, Mar 16, 2015 at 6:47 AM, Xuefu Zhang <xz...@cloudera.com> wrote:

> It seems that your remote driver failed to start. I suggest #1: try
> spark.master=local first; #2: check spark.log to find out why the remote
> driver fails.
>
> --Xuefu
>
> On Sun, Mar 15, 2015 at 10:17 PM, Amith sha <am...@gmail.com> wrote:
>
>> Hi,
>>
>> I have already added the spark-assembly jar in hive lib & here is my hive
>> log
>>
>>
>> 2015-03-16 10:40:08,299 INFO  [main]: SessionState
>> (SessionState.java:printInfo(852)) - Added
>>
>> [/opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar]
>> to class path
>> 2015-03-16 10:40:08,300 INFO  [main]: SessionState
>> (SessionState.java:printInfo(852)) - Added resources:
>>
>> [/opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar]
>> 2015-03-16 10:40:36,914 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.run
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:36,915 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=TimeToSubmit
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:36,915 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=compile
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:36,916 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=parse
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:36,916 INFO  [main]: parse.ParseDriver
>> (ParseDriver.java:parse(185)) - Parsing command: insert into table
>> test values(5,8900)
>> 2015-03-16 10:40:36,917 INFO  [main]: parse.ParseDriver
>> (ParseDriver.java:parse(206)) - Parse Completed
>> 2015-03-16 10:40:36,925 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=parse
>> start=1426482636916 end=1426482636925 duration=9
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:36,929 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=semanticAnalyze
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:37,034 INFO  [main]: parse.CalcitePlanner
>> (SemanticAnalyzer.java:analyzeInternal(10146)) - Starting Semantic
>> Analysis
>> 2015-03-16 10:40:37,212 INFO  [main]: parse.CalcitePlanner
>> (SemanticAnalyzer.java:genResolvedParseTree(10129)) - Completed phase
>> 1 of Semantic Analysis
>> 2015-03-16 10:40:37,212 INFO  [main]: parse.CalcitePlanner
>> (SemanticAnalyzer.java:getMetaData(1434)) - Get metadata for source
>> tables
>> 2015-03-16 10:40:37,213 INFO  [main]: parse.CalcitePlanner
>> (SemanticAnalyzer.java:getMetaData(1582)) - Get metadata for
>> subqueries
>> 2015-03-16 10:40:37,213 INFO  [main]: parse.CalcitePlanner
>> (SemanticAnalyzer.java:getMetaData(1606)) - Get metadata for
>> destination tables
>> 2015-03-16 10:40:37,214 INFO  [pool-3-thread-2]:
>> metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 2:
>> source:10.10.10.25 get_table : db=test tbl=test
>> 2015-03-16 10:40:37,214 INFO  [pool-3-thread-2]: HiveMetaStore.audit
>> (HiveMetaStore.java:logAuditEvent(358)) - ugi=hadoop2
>> ip=10.10.10.25    cmd=source:10.10.10.25 get_table : db=test tbl=test
>> 2015-03-16 10:40:37,316 INFO  [main]: parse.CalcitePlanner
>> (SemanticAnalyzer.java:genResolvedParseTree(10133)) - Completed
>> getting MetaData in Semantic Analysis
>> 2015-03-16 10:40:37,318 INFO  [main]: parse.BaseSemanticAnalyzer
>> (CalcitePlanner.java:canHandleAstForCbo(349)) - Not invoking CBO
>> because the statement has too few joins
>> 2015-03-16 10:40:37,320 INFO  [main]: common.FileUtils
>> (FileUtils.java:mkdir(501)) - Creating directory if it doesn't exist:
>>
>> hdfs://nn01:9000/user/hive/warehouse/test.db/test/.hive-staging_hive_2015-03-16_10-40-36_915_4571608652542611567-1
>> 2015-03-16 10:40:37,429 INFO  [main]: parse.CalcitePlanner
>> (SemanticAnalyzer.java:genFileSinkPlan(6474)) - Set stats collection
>> dir :
>> hdfs://nn01:9000/user/hive/warehouse/test.db/test/.hive-staging_hive_2015-03-16_10-40-36_915_4571608652542611567-1/-ext-10001
>> 2015-03-16 10:40:37,450 INFO  [main]: ppd.OpProcFactory
>> (OpProcFactory.java:process(657)) - Processing for FS(3)
>> 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
>> (OpProcFactory.java:process(657)) - Processing for SEL(2)
>> 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
>> (OpProcFactory.java:process(657)) - Processing for SEL(1)
>> 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
>> (OpProcFactory.java:process(384)) - Processing for TS(0)
>> 2015-03-16 10:40:37,507 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>> method=partition-retrieving
>> from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
>> 2015-03-16 10:40:37,510 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
>> method=partition-retrieving start=1426482637507 end=1426482637510
>> duration=3 from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
>> 2015-03-16 10:40:37,583 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>> method=SparkOptimizeOperatorTree
>> from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>> 2015-03-16 10:40:37,638 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
>> method=SparkOptimizeOperatorTree start=1426482637583 end=1426482637638
>> duration=55 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>> 2015-03-16 10:40:37,660 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>> method=SparkGenerateTaskTree
>> from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>> 2015-03-16 10:40:37,711 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
>> method=SparkGenerateTaskTree start=1426482637640 end=1426482637711
>> duration=71 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>> 2015-03-16 10:40:37,715 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>> method=SparkOptimizeTaskTree
>> from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>> 2015-03-16 10:40:37,737 INFO  [main]: physical.NullScanTaskDispatcher
>> (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
>> where optimization is applicable
>> 2015-03-16 10:40:37,741 INFO  [main]: physical.NullScanTaskDispatcher
>> (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
>> 2015-03-16 10:40:37,742 INFO  [main]: physical.NullScanTaskDispatcher
>> (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
>> where optimization is applicable
>> 2015-03-16 10:40:37,744 INFO  [main]: physical.NullScanTaskDispatcher
>> (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
>> 2015-03-16 10:40:37,747 INFO  [main]: physical.NullScanTaskDispatcher
>> (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
>> where optimization is applicable
>> 2015-03-16 10:40:37,754 INFO  [main]: physical.NullScanTaskDispatcher
>> (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
>> 2015-03-16 10:40:37,756 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
>> method=SparkOptimizeTaskTree start=1426482637715 end=1426482637756
>> duration=41 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
>> 2015-03-16 10:40:37,762 INFO  [main]: parse.CalcitePlanner
>> (SemanticAnalyzer.java:analyzeInternal(10231)) - Completed plan
>> generation
>> 2015-03-16 10:40:37,762 INFO  [main]: ql.Driver
>> (Driver.java:compile(433)) - Semantic Analysis Completed
>> 2015-03-16 10:40:37,762 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=semanticAnalyze
>> start=1426482636929 end=1426482637762 duration=833
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:37,763 INFO  [main]: ql.Driver
>> (Driver.java:getSchema(239)) - Returning Hive schema:
>> Schema(fieldSchemas:[FieldSchema(name:_col0, type:int, comment:null),
>> FieldSchema(name:_col1, type:int, comment:null)], properties:null)
>> 2015-03-16 10:40:37,765 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=compile
>> start=1426482636915 end=1426482637765 duration=850
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:37,765 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>> method=acquireReadWriteLocks from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:38,193 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
>> method=acquireReadWriteLocks start=1426482637765 end=1426482638193
>> duration=428 from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:38,193 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.execute
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:38,193 INFO  [main]: ql.Driver
>> (Driver.java:execute(1317)) - Starting command: insert into table test
>> values(5,8900)
>> 2015-03-16 10:40:38,194 INFO  [main]: ql.Driver
>> (SessionState.java:printInfo(852)) - Query ID =
>> hadoop2_20150316104040_c19975af-9dc4-4af2-bfab-d44619224679
>> 2015-03-16 10:40:38,194 INFO  [main]: ql.Driver
>> (SessionState.java:printInfo(852)) - Total jobs = 1
>> 2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=TimeToSubmit
>> start=1426482636914 end=1426482638194 duration=1280
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=runTasks
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
>> method=task.SPARK.Stage-1 from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:40:38,195 INFO  [main]: ql.Driver
>> (SessionState.java:printInfo(852)) - Launching Job 1 out of 1
>> 2015-03-16 10:40:38,195 INFO  [main]: ql.Driver
>> (Driver.java:launchTask(1630)) - Starting task [Stage-1:MAPRED] in
>> parallel
>> 2015-03-16 10:40:38,200 INFO  [Thread-49]: hive.metastore
>> (HiveMetaStoreClient.java:open(365)) - Trying to connect to metastore
>> with URI thrift://nn01:7099
>> 2015-03-16 10:40:38,208 INFO  [Thread-49]: hive.metastore
>> (HiveMetaStoreClient.java:open(461)) - Connected to metastore.
>> 2015-03-16 10:40:38,233 INFO  [Thread-49]: session.SessionState
>> (SessionState.java:start(488)) - No Tez session required at this
>> point. hive.execution.engine=mr.
>> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
>> (SessionState.java:printInfo(852)) - In order to change the average
>> load for a reducer (in bytes):
>> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
>> (SessionState.java:printInfo(852)) -   set
>> hive.exec.reducers.bytes.per.reducer=<number>
>> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
>> (SessionState.java:printInfo(852)) - In order to limit the maximum
>> number of reducers:
>> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
>> (SessionState.java:printInfo(852)) -   set
>> hive.exec.reducers.max=<number>
>> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
>> (SessionState.java:printInfo(852)) - In order to set a constant number
>> of reducers:
>> 2015-03-16 10:40:38,235 INFO  [Thread-49]: exec.Task
>> (SessionState.java:printInfo(852)) -   set
>> mapreduce.job.reduces=<number>
>> 2015-03-16 10:40:38,243 INFO  [Thread-49]:
>> session.SparkSessionManagerImpl
>> (SparkSessionManagerImpl.java:setup(82)) - Setting up the session
>> manager.
>> 2015-03-16 10:40:38,502 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>> property from hive configuration (hive.spark.client.connect.timeout ->
>> 1000).
>> 2015-03-16 10:40:38,504 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>> property from hive configuration (spark.serializer ->
>> org.apache.spark.serializer.KryoSerializer).
>> 2015-03-16 10:40:38,505 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>> property from hive configuration (spark.eventLog.enabled -> true).
>> 2015-03-16 10:40:38,508 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>> property from hive configuration (hive.spark.client.rpc.threads -> 8).
>> 2015-03-16 10:40:38,508 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>> property from hive configuration (hive.spark.client.secret.bits ->
>> 256).
>> 2015-03-16 10:40:38,508 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>> property from hive configuration (spark.home -> /opt/spark-1.2.1/).
>> 2015-03-16 10:40:38,509 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>> property from hive configuration (hive.spark.client.rpc.max.size ->
>> 52428800).
>> 2015-03-16 10:40:38,509 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>> property from hive configuration (spark.master ->
>> spark://10.10.10.25:7077).
>> 2015-03-16 10:40:38,509 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>> property from hive configuration (spark.executor.memory -> 512m).
>> 2015-03-16 10:40:38,510 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>> property from hive configuration
>> (hive.spark.client.server.connect.timeout -> 90000).
>> 2015-03-16 10:40:39,100 WARN  [Thread-49]: rpc.RpcConfiguration
>> (RpcConfiguration.java:getServerAddress(123)) - Your hostname, nn01,
>> resolves to a loopback address, but we couldn't find  any external IP
>> address!
>> 2015-03-16 10:40:39,101 WARN  [Thread-49]: rpc.RpcConfiguration
>> (RpcConfiguration.java:getServerAddress(125)) - Set
>> hive.spark.client.server.address if you need to bind to another
>> address.
>> 2015-03-16 10:40:39,113 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>> property from hive configuration (hive.spark.client.connect.timeout ->
>> 1000).
>> 2015-03-16 10:40:39,115 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>> property from hive configuration (spark.serializer ->
>> org.apache.spark.serializer.KryoSerializer).
>> 2015-03-16 10:40:39,116 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>> property from hive configuration (spark.eventLog.enabled -> true).
>> 2015-03-16 10:40:39,116 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>> property from hive configuration (hive.spark.client.rpc.threads -> 8).
>> 2015-03-16 10:40:39,117 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>> property from hive configuration (hive.spark.client.secret.bits ->
>> 256).
>> 2015-03-16 10:40:39,118 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>> property from hive configuration (spark.home -> /opt/spark-1.2.1/).
>> 2015-03-16 10:40:39,119 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>> property from hive configuration (hive.spark.client.rpc.max.size ->
>> 52428800).
>> 2015-03-16 10:40:39,120 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>> property from hive configuration (spark.master ->
>> spark://10.10.10.25:7077).
>> 2015-03-16 10:40:39,121 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
>> property from hive configuration (spark.executor.memory -> 512m).
>> 2015-03-16 10:40:39,122 INFO  [Thread-49]:
>> spark.HiveSparkClientFactory
>> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
>> property from hive configuration
>> (hive.spark.client.server.connect.timeout -> 90000).
>> 2015-03-16 10:40:43,081 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) - Spark assembly has been built with
>> Hive, including Datanucleus jars on classpath
>> 2015-03-16 10:40:47,867 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
>> property: hive.spark.client.connect.timeout=1000
>> 2015-03-16 10:40:47,869 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
>> property: hive.spark.client.rpc.threads=8
>> 2015-03-16 10:40:47,869 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
>> property: hive.spark.client.rpc.max.size=52428800
>> 2015-03-16 10:40:47,870 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
>> property: hive.spark.client.secret.bits=256
>> 2015-03-16 10:40:47,872 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
>> property: hive.spark.client.server.connect.timeout=90000
>> 2015-03-16 10:40:48,329 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) - 15/03/16 10:40:48 INFO
>> client.RemoteDriver: Connecting to: nn01:53098
>> 2015-03-16 10:40:48,379 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) - Exception in thread "main"
>> java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
>> 2015-03-16 10:40:48,379 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) -     at
>>
>> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)
>> 2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) -     at
>> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:137)
>> 2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) -     at
>> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:528)
>> 2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) -     at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> 2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) -     at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> 2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) -     at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> 2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) -     at
>> java.lang.reflect.Method.invoke(Method.java:606)
>> 2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) -     at
>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>> 2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) -     at
>> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>> 2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(530)) -     at
>> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> 2015-03-16 10:40:49,055 WARN  [Driver]: client.SparkClientImpl
>> (SparkClientImpl.java:run(388)) - Child process exited with code 1.
>> 2015-03-16 10:42:10,399 WARN  [Thread-49]: client.SparkClientImpl
>> (SparkClientImpl.java:<init>(96)) - Error while waiting for client to
>> connect.
>> java.util.concurrent.ExecutionException:
>> java.util.concurrent.TimeoutException: Timed out waiting for client
>> connection.
>>     at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>>     at
>> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
>>     at
>> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>>     at
>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
>> Caused by: java.util.concurrent.TimeoutException: Timed out waiting
>> for client connection.
>>     at
>> org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
>>     at
>> io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
>>     at
>> io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
>>     at
>> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
>>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>     at
>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
>>     at java.lang.Thread.run(Thread.java:744)
>> 2015-03-16 10:42:10,413 ERROR [Thread-49]: exec.Task
>> (SessionState.java:printError(861)) - Failed to execute spark task,
>> with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed
>> to create spark client.)'
>> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark
>> client.
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>>     at
>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
>> Caused by: java.lang.RuntimeException:
>> java.util.concurrent.ExecutionException:
>> java.util.concurrent.TimeoutException: Timed out waiting for client
>> connection.
>>     at com.google.common.base.Throwables.propagate(Throwables.java:156)
>>     at
>> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
>>     at
>> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
>>     ... 6 more
>> Caused by: java.util.concurrent.ExecutionException:
>> java.util.concurrent.TimeoutException: Timed out waiting for client
>> connection.
>>     at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>>     at
>> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
>>     ... 10 more
>> Caused by: java.util.concurrent.TimeoutException: Timed out waiting
>> for client connection.
>>     at
>> org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
>>     at
>> io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
>>     at
>> io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
>>     at
>> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
>>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>     at
>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
>>     at java.lang.Thread.run(Thread.java:744)
>>
>> 2015-03-16 10:42:10,413 ERROR [Thread-49]: exec.Task
>> (SparkTask.java:execute(124)) - Failed to execute spark task, with
>> exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to
>> create spark client.)'
>> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark
>> client.
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>>     at
>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
>> Caused by: java.lang.RuntimeException:
>> java.util.concurrent.ExecutionException:
>> java.util.concurrent.TimeoutException: Timed out waiting for client
>> connection.
>>     at com.google.common.base.Throwables.propagate(Throwables.java:156)
>>     at
>> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
>>     at
>> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
>>     at
>> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
>>     ... 6 more
>> Caused by: java.util.concurrent.ExecutionException:
>> java.util.concurrent.TimeoutException: Timed out waiting for client
>> connection.
>>     at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>>     at
>> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
>>     ... 10 more
>> Caused by: java.util.concurrent.TimeoutException: Timed out waiting
>> for client connection.
>>     at
>> org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
>>     at
>> io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
>>     at
>> io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
>>     at
>> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
>>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>     at
>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
>>     at java.lang.Thread.run(Thread.java:744)
>> 2015-03-16 10:42:12,204 ERROR [main]: ql.Driver
>> (SessionState.java:printError(861)) - FAILED: Execution Error, return
>> code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
>> 2015-03-16 10:42:12,205 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=Driver.execute
>> start=1426482638193 end=1426482732205 duration=94012
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:42:12,205 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:42:12,544 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks
>> start=1426482732205 end=1426482732544 duration=339
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:42:12,583 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:42:12,583 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks
>> start=1426482732583 end=1426482732583 duration=0
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.run
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=TimeToSubmit
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=compile
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:44:30,940 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=parse
>> from=org.apache.hadoop.hive.ql.Driver>
>> 2015-03-16 10:44:30,941 INFO  [main]: parse.ParseDriver
>> (ParseDriver.java:parse(185)) - Parsing command: insert into table
>> test values(5,8900)
>> 2015-03-16 10:44:30,942 INFO  [main]: parse.ParseDriver
>> (ParseDriver.java:parse(206)) - Parse Completed
>> 2015-03-16 10:44:30,942 INFO  [main]: log.PerfLogger
>> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=parse
>> start=1426482870940 end=1426482870942 duration=2
>> from=org.apache.hadoop.hive.ql.Driver>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>> Thanks & Regards
>> Amithsha
>>
>>
>> On Fri, Mar 13, 2015 at 7:36 PM, Xuefu Zhang <xz...@cloudera.com> wrote:
>> > You need to copy the spark-assembly.jar to your hive/lib.
>> >
>> > Also, you can check hive.log to get more messages.
>> >
>> > On Fri, Mar 13, 2015 at 4:51 AM, Amith sha <am...@gmail.com>
>> wrote:
>> >>
>> >> Hi all,
>> >>
>> >>
>> >> Recently i have configured Spark 1.2.0 and my environment is hadoop
>> >> 2.6.0 hive 1.1.0 Here i have tried hive on Spark while executing
>> >> insert into i am getting the following g error.
>> >>
>> >> Query ID = hadoop2_20150313162828_8764adad-a8e4-49da-9ef5-35e4ebd6bc63
>> >> Total jobs = 1
>> >> Launching Job 1 out of 1
>> >> In order to change the average load for a reducer (in bytes):
>> >>   set hive.exec.reducers.bytes.per.reducer=<number>
>> >> In order to limit the maximum number of reducers:
>> >>   set hive.exec.reducers.max=<number>
>> >> In order to set a constant number of reducers:
>> >>   set mapreduce.job.reduces=<number>
>> >> Failed to execute spark task, with exception
>> >> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create
>> >> spark client.)'
>> >> FAILED: Execution Error, return code 1 from
>> >> org.apache.hadoop.hive.ql.exec.spark.SparkTask
>> >>
>> >>
>> >>
>> >> Have added the spark-assembly jar in hive lib
>> >> And also in hive console using the command add jar followed by the
>> steps
>> >>
>> >> set spark.home=/opt/spark-1.2.1/;
>> >>
>> >>
>> >> add jar
>> >>
>> /opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar;
>> >>
>> >>
>> >>
>> >> set hive.execution.engine=spark;
>> >>
>> >>
>> >> set spark.master=spark://xxxxxxx:7077;
>> >>
>> >>
>> >> set spark.eventLog.enabled=true;
>> >>
>> >>
>> >> set spark.executor.memory=512m;
>> >>
>> >>
>> >> set spark.serializer=org.apache.spark.serializer.KryoSerializer;
>> >>
>> >> Can anyone suggest!!!!
>> >>
>> >>
>> >>
>> >> Thanks & Regards
>> >> Amithsha
>> >
>> >
>>
>
>

Re: Hive on Spark

Posted by Xuefu Zhang <xz...@cloudera.com>.
It seems that your remote driver failed to start. I suggest #1: try
spark.master=local first; #2: check spark.log to find out why the remote
driver fails.
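A minimal sketch of suggestion #1, reusing the table and sample row that appear in the log above (test, values(5,8900)); the settings are the same ones already used elsewhere in this thread:

set hive.execution.engine=spark;
set spark.master=local;
insert into table test values(5,8900);

If the local run succeeds, the failure is specific to the standalone-master setup; if it fails the same way, spark.log from suggestion #2 is the place to look for the driver-side error.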

--Xuefu

On Sun, Mar 15, 2015 at 10:17 PM, Amith sha <am...@gmail.com> wrote:

> Hi,
>
> I have already added the spark-assembly jar in hive lib & here is my hive
> log
>
>
> 2015-03-16 10:40:08,299 INFO  [main]: SessionState
> (SessionState.java:printInfo(852)) - Added
>
> [/opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar]
> to class path
> 2015-03-16 10:40:08,300 INFO  [main]: SessionState
> (SessionState.java:printInfo(852)) - Added resources:
>
> [/opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar]
> 2015-03-16 10:40:36,914 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.run
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:36,915 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=TimeToSubmit
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:36,915 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=compile
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:36,916 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=parse
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:36,916 INFO  [main]: parse.ParseDriver
> (ParseDriver.java:parse(185)) - Parsing command: insert into table
> test values(5,8900)
> 2015-03-16 10:40:36,917 INFO  [main]: parse.ParseDriver
> (ParseDriver.java:parse(206)) - Parse Completed
> 2015-03-16 10:40:36,925 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=parse
> start=1426482636916 end=1426482636925 duration=9
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:36,929 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=semanticAnalyze
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:37,034 INFO  [main]: parse.CalcitePlanner
> (SemanticAnalyzer.java:analyzeInternal(10146)) - Starting Semantic
> Analysis
> 2015-03-16 10:40:37,212 INFO  [main]: parse.CalcitePlanner
> (SemanticAnalyzer.java:genResolvedParseTree(10129)) - Completed phase
> 1 of Semantic Analysis
> 2015-03-16 10:40:37,212 INFO  [main]: parse.CalcitePlanner
> (SemanticAnalyzer.java:getMetaData(1434)) - Get metadata for source
> tables
> 2015-03-16 10:40:37,213 INFO  [main]: parse.CalcitePlanner
> (SemanticAnalyzer.java:getMetaData(1582)) - Get metadata for
> subqueries
> 2015-03-16 10:40:37,213 INFO  [main]: parse.CalcitePlanner
> (SemanticAnalyzer.java:getMetaData(1606)) - Get metadata for
> destination tables
> 2015-03-16 10:40:37,214 INFO  [pool-3-thread-2]:
> metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 2:
> source:10.10.10.25 get_table : db=test tbl=test
> 2015-03-16 10:40:37,214 INFO  [pool-3-thread-2]: HiveMetaStore.audit
> (HiveMetaStore.java:logAuditEvent(358)) - ugi=hadoop2
> ip=10.10.10.25    cmd=source:10.10.10.25 get_table : db=test tbl=test
> 2015-03-16 10:40:37,316 INFO  [main]: parse.CalcitePlanner
> (SemanticAnalyzer.java:genResolvedParseTree(10133)) - Completed
> getting MetaData in Semantic Analysis
> 2015-03-16 10:40:37,318 INFO  [main]: parse.BaseSemanticAnalyzer
> (CalcitePlanner.java:canHandleAstForCbo(349)) - Not invoking CBO
> because the statement has too few joins
> 2015-03-16 10:40:37,320 INFO  [main]: common.FileUtils
> (FileUtils.java:mkdir(501)) - Creating directory if it doesn't exist:
>
> hdfs://nn01:9000/user/hive/warehouse/test.db/test/.hive-staging_hive_2015-03-16_10-40-36_915_4571608652542611567-1
> 2015-03-16 10:40:37,429 INFO  [main]: parse.CalcitePlanner
> (SemanticAnalyzer.java:genFileSinkPlan(6474)) - Set stats collection
> dir :
> hdfs://nn01:9000/user/hive/warehouse/test.db/test/.hive-staging_hive_2015-03-16_10-40-36_915_4571608652542611567-1/-ext-10001
> 2015-03-16 10:40:37,450 INFO  [main]: ppd.OpProcFactory
> (OpProcFactory.java:process(657)) - Processing for FS(3)
> 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
> (OpProcFactory.java:process(657)) - Processing for SEL(2)
> 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
> (OpProcFactory.java:process(657)) - Processing for SEL(1)
> 2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
> (OpProcFactory.java:process(384)) - Processing for TS(0)
> 2015-03-16 10:40:37,507 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
> method=partition-retrieving
> from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
> 2015-03-16 10:40:37,510 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
> method=partition-retrieving start=1426482637507 end=1426482637510
> duration=3 from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
> 2015-03-16 10:40:37,583 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
> method=SparkOptimizeOperatorTree
> from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
> 2015-03-16 10:40:37,638 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
> method=SparkOptimizeOperatorTree start=1426482637583 end=1426482637638
> duration=55 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
> 2015-03-16 10:40:37,660 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
> method=SparkGenerateTaskTree
> from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
> 2015-03-16 10:40:37,711 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
> method=SparkGenerateTaskTree start=1426482637640 end=1426482637711
> duration=71 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
> 2015-03-16 10:40:37,715 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
> method=SparkOptimizeTaskTree
> from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
> 2015-03-16 10:40:37,737 INFO  [main]: physical.NullScanTaskDispatcher
> (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
> where optimization is applicable
> 2015-03-16 10:40:37,741 INFO  [main]: physical.NullScanTaskDispatcher
> (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
> 2015-03-16 10:40:37,742 INFO  [main]: physical.NullScanTaskDispatcher
> (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
> where optimization is applicable
> 2015-03-16 10:40:37,744 INFO  [main]: physical.NullScanTaskDispatcher
> (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
> 2015-03-16 10:40:37,747 INFO  [main]: physical.NullScanTaskDispatcher
> (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
> where optimization is applicable
> 2015-03-16 10:40:37,754 INFO  [main]: physical.NullScanTaskDispatcher
> (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
> 2015-03-16 10:40:37,756 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
> method=SparkOptimizeTaskTree start=1426482637715 end=1426482637756
> duration=41 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
> 2015-03-16 10:40:37,762 INFO  [main]: parse.CalcitePlanner
> (SemanticAnalyzer.java:analyzeInternal(10231)) - Completed plan
> generation
> 2015-03-16 10:40:37,762 INFO  [main]: ql.Driver
> (Driver.java:compile(433)) - Semantic Analysis Completed
> 2015-03-16 10:40:37,762 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=semanticAnalyze
> start=1426482636929 end=1426482637762 duration=833
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:37,763 INFO  [main]: ql.Driver
> (Driver.java:getSchema(239)) - Returning Hive schema:
> Schema(fieldSchemas:[FieldSchema(name:_col0, type:int, comment:null),
> FieldSchema(name:_col1, type:int, comment:null)], properties:null)
> 2015-03-16 10:40:37,765 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=compile
> start=1426482636915 end=1426482637765 duration=850
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:37,765 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
> method=acquireReadWriteLocks from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:38,193 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
> method=acquireReadWriteLocks start=1426482637765 end=1426482638193
> duration=428 from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:38,193 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.execute
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:38,193 INFO  [main]: ql.Driver
> (Driver.java:execute(1317)) - Starting command: insert into table test
> values(5,8900)
> 2015-03-16 10:40:38,194 INFO  [main]: ql.Driver
> (SessionState.java:printInfo(852)) - Query ID =
> hadoop2_20150316104040_c19975af-9dc4-4af2-bfab-d44619224679
> 2015-03-16 10:40:38,194 INFO  [main]: ql.Driver
> (SessionState.java:printInfo(852)) - Total jobs = 1
> 2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=TimeToSubmit
> start=1426482636914 end=1426482638194 duration=1280
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=runTasks
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
> method=task.SPARK.Stage-1 from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:40:38,195 INFO  [main]: ql.Driver
> (SessionState.java:printInfo(852)) - Launching Job 1 out of 1
> 2015-03-16 10:40:38,195 INFO  [main]: ql.Driver
> (Driver.java:launchTask(1630)) - Starting task [Stage-1:MAPRED] in
> parallel
> 2015-03-16 10:40:38,200 INFO  [Thread-49]: hive.metastore
> (HiveMetaStoreClient.java:open(365)) - Trying to connect to metastore
> with URI thrift://nn01:7099
> 2015-03-16 10:40:38,208 INFO  [Thread-49]: hive.metastore
> (HiveMetaStoreClient.java:open(461)) - Connected to metastore.
> 2015-03-16 10:40:38,233 INFO  [Thread-49]: session.SessionState
> (SessionState.java:start(488)) - No Tez session required at this
> point. hive.execution.engine=mr.
> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
> (SessionState.java:printInfo(852)) - In order to change the average
> load for a reducer (in bytes):
> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
> (SessionState.java:printInfo(852)) -   set
> hive.exec.reducers.bytes.per.reducer=<number>
> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
> (SessionState.java:printInfo(852)) - In order to limit the maximum
> number of reducers:
> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
> (SessionState.java:printInfo(852)) -   set
> hive.exec.reducers.max=<number>
> 2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
> (SessionState.java:printInfo(852)) - In order to set a constant number
> of reducers:
> 2015-03-16 10:40:38,235 INFO  [Thread-49]: exec.Task
> (SessionState.java:printInfo(852)) -   set
> mapreduce.job.reduces=<number>
> 2015-03-16 10:40:38,243 INFO  [Thread-49]:
> session.SparkSessionManagerImpl
> (SparkSessionManagerImpl.java:setup(82)) - Setting up the session
> manager.
> 2015-03-16 10:40:38,502 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
> property from hive configuration (hive.spark.client.connect.timeout ->
> 1000).
> 2015-03-16 10:40:38,504 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
> property from hive configuration (spark.serializer ->
> org.apache.spark.serializer.KryoSerializer).
> 2015-03-16 10:40:38,505 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
> property from hive configuration (spark.eventLog.enabled -> true).
> 2015-03-16 10:40:38,508 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
> property from hive configuration (hive.spark.client.rpc.threads -> 8).
> 2015-03-16 10:40:38,508 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
> property from hive configuration (hive.spark.client.secret.bits ->
> 256).
> 2015-03-16 10:40:38,508 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
> property from hive configuration (spark.home -> /opt/spark-1.2.1/).
> 2015-03-16 10:40:38,509 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
> property from hive configuration (hive.spark.client.rpc.max.size ->
> 52428800).
> 2015-03-16 10:40:38,509 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
> property from hive configuration (spark.master ->
> spark://10.10.10.25:7077).
> 2015-03-16 10:40:38,509 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
> property from hive configuration (spark.executor.memory -> 512m).
> 2015-03-16 10:40:38,510 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
> property from hive configuration
> (hive.spark.client.server.connect.timeout -> 90000).
> 2015-03-16 10:40:39,100 WARN  [Thread-49]: rpc.RpcConfiguration
> (RpcConfiguration.java:getServerAddress(123)) - Your hostname, nn01,
> resolves to a loopback address, but we couldn't find  any external IP
> address!
> 2015-03-16 10:40:39,101 WARN  [Thread-49]: rpc.RpcConfiguration
> (RpcConfiguration.java:getServerAddress(125)) - Set
> hive.spark.client.server.address if you need to bind to another
> address.
> 2015-03-16 10:40:39,113 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
> property from hive configuration (hive.spark.client.connect.timeout ->
> 1000).
> 2015-03-16 10:40:39,115 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
> property from hive configuration (spark.serializer ->
> org.apache.spark.serializer.KryoSerializer).
> 2015-03-16 10:40:39,116 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
> property from hive configuration (spark.eventLog.enabled -> true).
> 2015-03-16 10:40:39,116 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
> property from hive configuration (hive.spark.client.rpc.threads -> 8).
> 2015-03-16 10:40:39,117 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
> property from hive configuration (hive.spark.client.secret.bits ->
> 256).
> 2015-03-16 10:40:39,118 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
> property from hive configuration (spark.home -> /opt/spark-1.2.1/).
> 2015-03-16 10:40:39,119 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
> property from hive configuration (hive.spark.client.rpc.max.size ->
> 52428800).
> 2015-03-16 10:40:39,120 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
> property from hive configuration (spark.master ->
> spark://10.10.10.25:7077).
> 2015-03-16 10:40:39,121 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
> property from hive configuration (spark.executor.memory -> 512m).
> 2015-03-16 10:40:39,122 INFO  [Thread-49]:
> spark.HiveSparkClientFactory
> (HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
> property from hive configuration
> (hive.spark.client.server.connect.timeout -> 90000).
> 2015-03-16 10:40:43,081 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) - Spark assembly has been built with
> Hive, including Datanucleus jars on classpath
> 2015-03-16 10:40:47,867 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
> property: hive.spark.client.connect.timeout=1000
> 2015-03-16 10:40:47,869 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
> property: hive.spark.client.rpc.threads=8
> 2015-03-16 10:40:47,869 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
> property: hive.spark.client.rpc.max.size=52428800
> 2015-03-16 10:40:47,870 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
> property: hive.spark.client.secret.bits=256
> 2015-03-16 10:40:47,872 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
> property: hive.spark.client.server.connect.timeout=90000
> 2015-03-16 10:40:48,329 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) - 15/03/16 10:40:48 INFO
> client.RemoteDriver: Connecting to: nn01:53098
> 2015-03-16 10:40:48,379 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) - Exception in thread "main"
> java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
> 2015-03-16 10:40:48,379 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) -     at
>
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)
> 2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) -     at
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:137)
> 2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) -     at
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:528)
> 2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) -     at
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) -     at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) -     at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) -     at
> java.lang.reflect.Method.invoke(Method.java:606)
> 2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) -     at
> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
> 2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) -     at
> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
> 2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
> (SparkClientImpl.java:run(530)) -     at
> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 2015-03-16 10:40:49,055 WARN  [Driver]: client.SparkClientImpl
> (SparkClientImpl.java:run(388)) - Child process exited with code 1.
> 2015-03-16 10:42:10,399 WARN  [Thread-49]: client.SparkClientImpl
> (SparkClientImpl.java:<init>(96)) - Error while waiting for client to
> connect.
> java.util.concurrent.ExecutionException:
> java.util.concurrent.TimeoutException: Timed out waiting for client
> connection.
>     at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>     at
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
>     at
> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>     at
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
>     at
> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
>     at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
>     at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
>     at
> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
>     at
> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>     at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
>     at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
> Caused by: java.util.concurrent.TimeoutException: Timed out waiting
> for client connection.
>     at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
>     at
> io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
>     at
> io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
>     at
> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>     at
> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
>     at java.lang.Thread.run(Thread.java:744)
> 2015-03-16 10:42:10,413 ERROR [Thread-49]: exec.Task
> (SessionState.java:printError(861)) - Failed to execute spark task,
> with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed
> to create spark client.)'
> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark
> client.
>     at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
>     at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
>     at
> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
>     at
> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>     at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
>     at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
> Caused by: java.lang.RuntimeException:
> java.util.concurrent.ExecutionException:
> java.util.concurrent.TimeoutException: Timed out waiting for client
> connection.
>     at com.google.common.base.Throwables.propagate(Throwables.java:156)
>     at
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
>     at
> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>     at
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
>     at
> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
>     at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
>     ... 6 more
> Caused by: java.util.concurrent.ExecutionException:
> java.util.concurrent.TimeoutException: Timed out waiting for client
> connection.
>     at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>     at
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
>     ... 10 more
> Caused by: java.util.concurrent.TimeoutException: Timed out waiting
> for client connection.
>     at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
>     at
> io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
>     at
> io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
>     at
> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>     at
> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
>     at java.lang.Thread.run(Thread.java:744)
>
> 2015-03-16 10:42:10,413 ERROR [Thread-49]: exec.Task
> (SparkTask.java:execute(124)) - Failed to execute spark task, with
> exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to
> create spark client.)'
> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark
> client.
>     at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
>     at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
>     at
> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
>     at
> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>     at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
>     at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
> Caused by: java.lang.RuntimeException:
> java.util.concurrent.ExecutionException:
> java.util.concurrent.TimeoutException: Timed out waiting for client
> connection.
>     at com.google.common.base.Throwables.propagate(Throwables.java:156)
>     at
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
>     at
> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>     at
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
>     at
> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
>     at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
>     ... 6 more
> Caused by: java.util.concurrent.ExecutionException:
> java.util.concurrent.TimeoutException: Timed out waiting for client
> connection.
>     at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>     at
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
>     ... 10 more
> Caused by: java.util.concurrent.TimeoutException: Timed out waiting
> for client connection.
>     at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
>     at
> io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
>     at
> io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
>     at
> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>     at
> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
>     at java.lang.Thread.run(Thread.java:744)
> 2015-03-16 10:42:12,204 ERROR [main]: ql.Driver
> (SessionState.java:printError(861)) - FAILED: Execution Error, return
> code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
> 2015-03-16 10:42:12,205 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=Driver.execute
> start=1426482638193 end=1426482732205 duration=94012
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:42:12,205 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:42:12,544 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks
> start=1426482732205 end=1426482732544 duration=339
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:42:12,583 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:42:12,583 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks
> start=1426482732583 end=1426482732583 duration=0
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.run
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=TimeToSubmit
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=compile
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:44:30,940 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=parse
> from=org.apache.hadoop.hive.ql.Driver>
> 2015-03-16 10:44:30,941 INFO  [main]: parse.ParseDriver
> (ParseDriver.java:parse(185)) - Parsing command: insert into table
> test values(5,8900)
> 2015-03-16 10:44:30,942 INFO  [main]: parse.ParseDriver
> (ParseDriver.java:parse(206)) - Parse Completed
> 2015-03-16 10:44:30,942 INFO  [main]: log.PerfLogger
> (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=parse
> start=1426482870940 end=1426482870942 duration=2
> from=org.apache.hadoop.hive.ql.Driver>
>
>
>
>
>
>
>
>
>
> Thanks & Regards
> Amithsha
>
>
> On Fri, Mar 13, 2015 at 7:36 PM, Xuefu Zhang <xz...@cloudera.com> wrote:
> > You need to copy the spark-assembly.jar to your hive/lib.
> >
> > Also, you can check hive.log to get more messages.
> >
> > On Fri, Mar 13, 2015 at 4:51 AM, Amith sha <am...@gmail.com> wrote:
> >>
> >> Hi all,
> >>
> >>
> >> Recently i have configured Spark 1.2.0 and my environment is hadoop
> >> 2.6.0 hive 1.1.0 Here i have tried hive on Spark while executing
> >> insert into i am getting the following g error.
> >>
> >> Query ID = hadoop2_20150313162828_8764adad-a8e4-49da-9ef5-35e4ebd6bc63
> >> Total jobs = 1
> >> Launching Job 1 out of 1
> >> In order to change the average load for a reducer (in bytes):
> >>   set hive.exec.reducers.bytes.per.reducer=<number>
> >> In order to limit the maximum number of reducers:
> >>   set hive.exec.reducers.max=<number>
> >> In order to set a constant number of reducers:
> >>   set mapreduce.job.reduces=<number>
> >> Failed to execute spark task, with exception
> >> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create
> >> spark client.)'
> >> FAILED: Execution Error, return code 1 from
> >> org.apache.hadoop.hive.ql.exec.spark.SparkTask
> >>
> >>
> >>
> >> Have added the spark-assembly jar in hive lib
> >> And also in hive console using the command add jar followed by the
> steps
> >>
> >> set spark.home=/opt/spark-1.2.1/;
> >>
> >>
> >> add jar
> >>
> /opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar;
> >>
> >>
> >>
> >> set hive.execution.engine=spark;
> >>
> >>
> >> set spark.master=spark://xxxxxxx:7077;
> >>
> >>
> >> set spark.eventLog.enabled=true;
> >>
> >>
> >> set spark.executor.memory=512m;
> >>
> >>
> >> set spark.serializer=org.apache.spark.serializer.KryoSerializer;
> >>
> >> Can anyone suggest!!!!
> >>
> >>
> >>
> >> Thanks & Regards
> >> Amithsha
> >
> >
>

Re: Hive on Spark

Posted by Amith sha <am...@gmail.com>.
Hi,

I have already added the spark-assembly jar to the Hive lib directory; a quick check of the session-level add jar and my hive log are below.
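For reference, a sketch of double-checking the session-level resource (the jar path is the one from my earlier mail; note that list jars only shows resources added with add jar in this session, not the contents of hive/lib):

add jar /opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar;
list jars;

The log from the failing run follows.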


2015-03-16 10:40:08,299 INFO  [main]: SessionState
(SessionState.java:printInfo(852)) - Added
[/opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar]
to class path
2015-03-16 10:40:08,300 INFO  [main]: SessionState
(SessionState.java:printInfo(852)) - Added resources:
[/opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar]
2015-03-16 10:40:36,914 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.run
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:36,915 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=TimeToSubmit
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:36,915 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=compile
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:36,916 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=parse
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:36,916 INFO  [main]: parse.ParseDriver
(ParseDriver.java:parse(185)) - Parsing command: insert into table
test values(5,8900)
2015-03-16 10:40:36,917 INFO  [main]: parse.ParseDriver
(ParseDriver.java:parse(206)) - Parse Completed
2015-03-16 10:40:36,925 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=parse
start=1426482636916 end=1426482636925 duration=9
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:36,929 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=semanticAnalyze
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:37,034 INFO  [main]: parse.CalcitePlanner
(SemanticAnalyzer.java:analyzeInternal(10146)) - Starting Semantic
Analysis
2015-03-16 10:40:37,212 INFO  [main]: parse.CalcitePlanner
(SemanticAnalyzer.java:genResolvedParseTree(10129)) - Completed phase
1 of Semantic Analysis
2015-03-16 10:40:37,212 INFO  [main]: parse.CalcitePlanner
(SemanticAnalyzer.java:getMetaData(1434)) - Get metadata for source
tables
2015-03-16 10:40:37,213 INFO  [main]: parse.CalcitePlanner
(SemanticAnalyzer.java:getMetaData(1582)) - Get metadata for
subqueries
2015-03-16 10:40:37,213 INFO  [main]: parse.CalcitePlanner
(SemanticAnalyzer.java:getMetaData(1606)) - Get metadata for
destination tables
2015-03-16 10:40:37,214 INFO  [pool-3-thread-2]:
metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 2:
source:10.10.10.25 get_table : db=test tbl=test
2015-03-16 10:40:37,214 INFO  [pool-3-thread-2]: HiveMetaStore.audit
(HiveMetaStore.java:logAuditEvent(358)) - ugi=hadoop2
ip=10.10.10.25    cmd=source:10.10.10.25 get_table : db=test tbl=test
2015-03-16 10:40:37,316 INFO  [main]: parse.CalcitePlanner
(SemanticAnalyzer.java:genResolvedParseTree(10133)) - Completed
getting MetaData in Semantic Analysis
2015-03-16 10:40:37,318 INFO  [main]: parse.BaseSemanticAnalyzer
(CalcitePlanner.java:canHandleAstForCbo(349)) - Not invoking CBO
because the statement has too few joins
2015-03-16 10:40:37,320 INFO  [main]: common.FileUtils
(FileUtils.java:mkdir(501)) - Creating directory if it doesn't exist:
hdfs://nn01:9000/user/hive/warehouse/test.db/test/.hive-staging_hive_2015-03-16_10-40-36_915_4571608652542611567-1
2015-03-16 10:40:37,429 INFO  [main]: parse.CalcitePlanner
(SemanticAnalyzer.java:genFileSinkPlan(6474)) - Set stats collection
dir : hdfs://nn01:9000/user/hive/warehouse/test.db/test/.hive-staging_hive_2015-03-16_10-40-36_915_4571608652542611567-1/-ext-10001
2015-03-16 10:40:37,450 INFO  [main]: ppd.OpProcFactory
(OpProcFactory.java:process(657)) - Processing for FS(3)
2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
(OpProcFactory.java:process(657)) - Processing for SEL(2)
2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
(OpProcFactory.java:process(657)) - Processing for SEL(1)
2015-03-16 10:40:37,455 INFO  [main]: ppd.OpProcFactory
(OpProcFactory.java:process(384)) - Processing for TS(0)
2015-03-16 10:40:37,507 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
method=partition-retrieving
from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
2015-03-16 10:40:37,510 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
method=partition-retrieving start=1426482637507 end=1426482637510
duration=3 from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
2015-03-16 10:40:37,583 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
method=SparkOptimizeOperatorTree
from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
2015-03-16 10:40:37,638 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
method=SparkOptimizeOperatorTree start=1426482637583 end=1426482637638
duration=55 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
2015-03-16 10:40:37,660 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
method=SparkGenerateTaskTree
from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
2015-03-16 10:40:37,711 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
method=SparkGenerateTaskTree start=1426482637640 end=1426482637711
duration=71 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
2015-03-16 10:40:37,715 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
method=SparkOptimizeTaskTree
from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
2015-03-16 10:40:37,737 INFO  [main]: physical.NullScanTaskDispatcher
(NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
where optimization is applicable
2015-03-16 10:40:37,741 INFO  [main]: physical.NullScanTaskDispatcher
(NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
2015-03-16 10:40:37,742 INFO  [main]: physical.NullScanTaskDispatcher
(NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
where optimization is applicable
2015-03-16 10:40:37,744 INFO  [main]: physical.NullScanTaskDispatcher
(NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
2015-03-16 10:40:37,747 INFO  [main]: physical.NullScanTaskDispatcher
(NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans
where optimization is applicable
2015-03-16 10:40:37,754 INFO  [main]: physical.NullScanTaskDispatcher
(NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
2015-03-16 10:40:37,756 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
method=SparkOptimizeTaskTree start=1426482637715 end=1426482637756
duration=41 from=org.apache.hadoop.hive.ql.parse.spark.SparkCompiler>
2015-03-16 10:40:37,762 INFO  [main]: parse.CalcitePlanner
(SemanticAnalyzer.java:analyzeInternal(10231)) - Completed plan
generation
2015-03-16 10:40:37,762 INFO  [main]: ql.Driver
(Driver.java:compile(433)) - Semantic Analysis Completed
2015-03-16 10:40:37,762 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=semanticAnalyze
start=1426482636929 end=1426482637762 duration=833
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:37,763 INFO  [main]: ql.Driver
(Driver.java:getSchema(239)) - Returning Hive schema:
Schema(fieldSchemas:[FieldSchema(name:_col0, type:int, comment:null),
FieldSchema(name:_col1, type:int, comment:null)], properties:null)
2015-03-16 10:40:37,765 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=compile
start=1426482636915 end=1426482637765 duration=850
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:37,765 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
method=acquireReadWriteLocks from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:38,193 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG
method=acquireReadWriteLocks start=1426482637765 end=1426482638193
duration=428 from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:38,193 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.execute
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:38,193 INFO  [main]: ql.Driver
(Driver.java:execute(1317)) - Starting command: insert into table test
values(5,8900)
2015-03-16 10:40:38,194 INFO  [main]: ql.Driver
(SessionState.java:printInfo(852)) - Query ID =
hadoop2_20150316104040_c19975af-9dc4-4af2-bfab-d44619224679
2015-03-16 10:40:38,194 INFO  [main]: ql.Driver
(SessionState.java:printInfo(852)) - Total jobs = 1
2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=TimeToSubmit
start=1426482636914 end=1426482638194 duration=1280
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=runTasks
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:38,194 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG
method=task.SPARK.Stage-1 from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:40:38,195 INFO  [main]: ql.Driver
(SessionState.java:printInfo(852)) - Launching Job 1 out of 1
2015-03-16 10:40:38,195 INFO  [main]: ql.Driver
(Driver.java:launchTask(1630)) - Starting task [Stage-1:MAPRED] in
parallel
2015-03-16 10:40:38,200 INFO  [Thread-49]: hive.metastore
(HiveMetaStoreClient.java:open(365)) - Trying to connect to metastore
with URI thrift://nn01:7099
2015-03-16 10:40:38,208 INFO  [Thread-49]: hive.metastore
(HiveMetaStoreClient.java:open(461)) - Connected to metastore.
2015-03-16 10:40:38,233 INFO  [Thread-49]: session.SessionState
(SessionState.java:start(488)) - No Tez session required at this
point. hive.execution.engine=mr.
2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
(SessionState.java:printInfo(852)) - In order to change the average
load for a reducer (in bytes):
2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
(SessionState.java:printInfo(852)) -   set
hive.exec.reducers.bytes.per.reducer=<number>
2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
(SessionState.java:printInfo(852)) - In order to limit the maximum
number of reducers:
2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
(SessionState.java:printInfo(852)) -   set
hive.exec.reducers.max=<number>
2015-03-16 10:40:38,234 INFO  [Thread-49]: exec.Task
(SessionState.java:printInfo(852)) - In order to set a constant number
of reducers:
2015-03-16 10:40:38,235 INFO  [Thread-49]: exec.Task
(SessionState.java:printInfo(852)) -   set
mapreduce.job.reduces=<number>
2015-03-16 10:40:38,243 INFO  [Thread-49]:
session.SparkSessionManagerImpl
(SparkSessionManagerImpl.java:setup(82)) - Setting up the session
manager.
2015-03-16 10:40:38,502 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
property from hive configuration (hive.spark.client.connect.timeout ->
1000).
2015-03-16 10:40:38,504 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
property from hive configuration (spark.serializer ->
org.apache.spark.serializer.KryoSerializer).
2015-03-16 10:40:38,505 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
property from hive configuration (spark.eventLog.enabled -> true).
2015-03-16 10:40:38,508 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
property from hive configuration (hive.spark.client.rpc.threads -> 8).
2015-03-16 10:40:38,508 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
property from hive configuration (hive.spark.client.secret.bits ->
256).
2015-03-16 10:40:38,508 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
property from hive configuration (spark.home -> /opt/spark-1.2.1/).
2015-03-16 10:40:38,509 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
property from hive configuration (hive.spark.client.rpc.max.size ->
52428800).
2015-03-16 10:40:38,509 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
property from hive configuration (spark.master ->
spark://10.10.10.25:7077).
2015-03-16 10:40:38,509 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
property from hive configuration (spark.executor.memory -> 512m).
2015-03-16 10:40:38,510 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
property from hive configuration
(hive.spark.client.server.connect.timeout -> 90000).
2015-03-16 10:40:39,100 WARN  [Thread-49]: rpc.RpcConfiguration
(RpcConfiguration.java:getServerAddress(123)) - Your hostname, nn01,
resolves to a loopback address, but we couldn't find  any external IP
address!
2015-03-16 10:40:39,101 WARN  [Thread-49]: rpc.RpcConfiguration
(RpcConfiguration.java:getServerAddress(125)) - Set
hive.spark.client.server.address if you need to bind to another
address.
2015-03-16 10:40:39,113 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
property from hive configuration (hive.spark.client.connect.timeout ->
1000).
2015-03-16 10:40:39,115 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
property from hive configuration (spark.serializer ->
org.apache.spark.serializer.KryoSerializer).
2015-03-16 10:40:39,116 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
property from hive configuration (spark.eventLog.enabled -> true).
2015-03-16 10:40:39,116 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
property from hive configuration (hive.spark.client.rpc.threads -> 8).
2015-03-16 10:40:39,117 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
property from hive configuration (hive.spark.client.secret.bits ->
256).
2015-03-16 10:40:39,118 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
property from hive configuration (spark.home -> /opt/spark-1.2.1/).
2015-03-16 10:40:39,119 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
property from hive configuration (hive.spark.client.rpc.max.size ->
52428800).
2015-03-16 10:40:39,120 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
property from hive configuration (spark.master ->
spark://10.10.10.25:7077).
2015-03-16 10:40:39,121 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(113)) - load spark
property from hive configuration (spark.executor.memory -> 512m).
2015-03-16 10:40:39,122 INFO  [Thread-49]:
spark.HiveSparkClientFactory
(HiveSparkClientFactory.java:initiateSparkConf(130)) - load RPC
property from hive configuration
(hive.spark.client.server.connect.timeout -> 90000).
2015-03-16 10:40:43,081 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) - Spark assembly has been built with
Hive, including Datanucleus jars on classpath
2015-03-16 10:40:47,867 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
property: hive.spark.client.connect.timeout=1000
2015-03-16 10:40:47,869 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
property: hive.spark.client.rpc.threads=8
2015-03-16 10:40:47,869 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
property: hive.spark.client.rpc.max.size=52428800
2015-03-16 10:40:47,870 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
property: hive.spark.client.secret.bits=256
2015-03-16 10:40:47,872 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) - Warning: Ignoring non-spark config
property: hive.spark.client.server.connect.timeout=90000
2015-03-16 10:40:48,329 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) - 15/03/16 10:40:48 INFO
client.RemoteDriver: Connecting to: nn01:53098
2015-03-16 10:40:48,379 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) - Exception in thread "main"
java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
2015-03-16 10:40:48,379 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)
2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:137)
2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:528)
2015-03-16 10:40:48,380 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2015-03-16 10:40:48,384 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
java.lang.reflect.Method.invoke(Method.java:606)
2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
2015-03-16 10:40:48,387 INFO  [stderr-redir-1]: client.SparkClientImpl
(SparkClientImpl.java:run(530)) -     at
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2015-03-16 10:40:49,055 WARN  [Driver]: client.SparkClientImpl
(SparkClientImpl.java:run(388)) - Child process exited with code 1.
2015-03-16 10:42:10,399 WARN  [Thread-49]: client.SparkClientImpl
(SparkClientImpl.java:<init>(96)) - Error while waiting for client to
connect.
java.util.concurrent.ExecutionException:
java.util.concurrent.TimeoutException: Timed out waiting for client
connection.
    at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
    at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
    at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
    at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
    at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
    at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
    at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
Caused by: java.util.concurrent.TimeoutException: Timed out waiting
for client connection.
    at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
    at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
    at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    at java.lang.Thread.run(Thread.java:744)
2015-03-16 10:42:10,413 ERROR [Thread-49]: exec.Task
(SessionState.java:printError(861)) - Failed to execute spark task,
with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed
to create spark client.)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
    at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
    at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
Caused by: java.lang.RuntimeException:
java.util.concurrent.ExecutionException:
java.util.concurrent.TimeoutException: Timed out waiting for client
connection.
    at com.google.common.base.Throwables.propagate(Throwables.java:156)
    at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
    at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
    at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
    at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
    ... 6 more
Caused by: java.util.concurrent.ExecutionException:
java.util.concurrent.TimeoutException: Timed out waiting for client
connection.
    at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
    at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
    ... 10 more
Caused by: java.util.concurrent.TimeoutException: Timed out waiting
for client connection.
    at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
    at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
    at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    at java.lang.Thread.run(Thread.java:744)

2015-03-16 10:42:10,413 ERROR [Thread-49]: exec.Task
(SparkTask.java:execute(124)) - Failed to execute spark task, with
exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to
create spark client.)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
    at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:113)
    at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:95)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
Caused by: java.lang.RuntimeException:
java.util.concurrent.ExecutionException:
java.util.concurrent.TimeoutException: Timed out waiting for client
connection.
    at com.google.common.base.Throwables.propagate(Throwables.java:156)
    at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
    at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
    at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
    at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:58)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
    ... 6 more
Caused by: java.util.concurrent.ExecutionException:
java.util.concurrent.TimeoutException: Timed out waiting for client
connection.
    at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
    at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
    ... 10 more
Caused by: java.util.concurrent.TimeoutException: Timed out waiting
for client connection.
    at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:134)
    at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
    at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    at java.lang.Thread.run(Thread.java:744)
2015-03-16 10:42:12,204 ERROR [main]: ql.Driver
(SessionState.java:printError(861)) - FAILED: Execution Error, return
code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
2015-03-16 10:42:12,205 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=Driver.execute
start=1426482638193 end=1426482732205 duration=94012
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:42:12,205 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:42:12,544 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks
start=1426482732205 end=1426482732544 duration=339
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:42:12,583 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:42:12,583 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks
start=1426482732583 end=1426482732583 duration=0
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.run
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=TimeToSubmit
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:44:30,939 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=compile
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:44:30,940 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=parse
from=org.apache.hadoop.hive.ql.Driver>
2015-03-16 10:44:30,941 INFO  [main]: parse.ParseDriver
(ParseDriver.java:parse(185)) - Parsing command: insert into table
test values(5,8900)
2015-03-16 10:44:30,942 INFO  [main]: parse.ParseDriver
(ParseDriver.java:parse(206)) - Parse Completed
2015-03-16 10:44:30,942 INFO  [main]: log.PerfLogger
(PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=parse
start=1426482870940 end=1426482870942 duration=2
from=org.apache.hadoop.hive.ql.Driver>
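
As an aside, the rpc.RpcConfiguration warnings earlier in this log (hostname nn01 resolving to a loopback address, with the hint to set hive.spark.client.server.address) point at one specific setting. Changing that binding would look roughly like the line below from the Hive CLI; 10.10.10.25 is only an illustrative value taken from the spark.master URL in this log, and the real value has to be whatever address the host is actually reachable on:

# bind the Hive-side RPC server to an explicit, non-loopback address (value is illustrative)
set hive.spark.client.server.address=10.10.10.25;

The same property can equally be placed in hive-site.xml instead of being set per session.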

Thanks & Regards
Amithsha


On Fri, Mar 13, 2015 at 7:36 PM, Xuefu Zhang <xz...@cloudera.com> wrote:
> You need to copy the spark-assembly.jar to your hive/lib.
>
> Also, you can check hive.log to get more messages.
>

Re: Hive on Spark

Posted by Xuefu Zhang <xz...@cloudera.com>.
You need to copy the spark-assembly.jar to your hive/lib.

Also, you can check hive.log to get more messages.
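
In concrete terms, the copy step suggested here might look like the single command below. The spark-assembly path is the one already referenced in this thread, while the Hive installation directory (/opt/hive-1.1.0) is only an assumed example and should be replaced with the actual Hive home:

# put the Spark assembly on Hive's classpath permanently, instead of relying on add jar (paths illustrative)
cp /opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar /opt/hive-1.1.0/lib/

A Hive session started after this copy will pick the jar up from lib without needing the add jar step.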
