Posted to issues@spark.apache.org by "Cheng Lian (JIRA)" <ji...@apache.org> on 2015/01/05 11:39:34 UTC

[jira] [Commented] (SPARK-4908) Spark SQL built for Hive 13 fails under concurrent metadata queries

    [ https://issues.apache.org/jira/browse/SPARK-4908?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14264469#comment-14264469 ] 

Cheng Lian commented on SPARK-4908:
-----------------------------------

Would like to add a comment about the root cause of this issue. When serving a HiveQL query, Spark SQL's {{HiveContext.runHive}} method gets an {{org.apache.hadoop.hive.ql.Driver}} instance via {{CommandProcessorFactory.get}}, which creates and caches {{Driver}} instances. In the case of {{HiveThriftServer2}}, {{HiveContext.runHive}} is called by multiple threads owned by the Thrift server's thread pool. However, {{Driver}} is not thread safe, yet the cached {{Driver}} instance can be accessed by multiple threads concurrently, which causes the problem. PR #3834 fixes this issue by synchronizing {{HiveContext.runHive}}, which is a valid fix. HiveServer2, on the other hand, creates a new {{Driver}} instance for every served SQL query when initializing a {{SQLOperation}}, so its {{Driver}} instances are never shared across threads.
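
To make the hazard and the fix concrete, here is a minimal, self-contained Scala sketch (not Spark's or Hive's actual code; {{FakeDriver}}, {{runHive}} and the other names below are made up for illustration). A single cached, non-thread-safe object is hit from a thread pool, and access to it is serialized with {{synchronized}}, which is roughly the essence of what PR #3834 does to {{HiveContext.runHive}}:
{code}
import java.util.concurrent.Executors

import scala.concurrent.duration._
import scala.concurrent.{Await, ExecutionContext, Future}

// Hypothetical stand-in for the non-thread-safe Hive Driver that the
// command processor factory caches and hands back to every caller.
class FakeDriver {
  private var currentDb: String = "default"

  def run(command: String): String = {
    // Without external locking, interleaved calls from different threads
    // can observe (and clobber) this half-updated state.
    currentDb = command.stripPrefix("use ").trim
    Thread.sleep(10) // widen the race window for demonstration purposes
    s"OK, current database is now '$currentDb'"
  }
}

object SynchronizedRunHiveSketch extends App {
  private val pool = Executors.newFixedThreadPool(4)
  implicit val ec: ExecutionContext = ExecutionContext.fromExecutorService(pool)

  // One shared instance, mirroring the cached Driver handed out to all threads.
  private val cachedDriver = new FakeDriver

  // The essence of the fix: serialize every call that touches the shared driver.
  def runHive(command: String): String = synchronized {
    cachedDriver.run(command)
  }

  val results = Future.traverse(1 to 3)(i => Future(runHive(s"use db_$i")))
  Await.result(results, 1.minute).foreach(println)
  pool.shutdown()
}
{code}
The alternative design, which HiveServer2 follows, is to avoid sharing altogether: construct a fresh {{Driver}} per operation so that no lock is needed, at the cost of giving up the cached instance.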

[~dyross] When built against Hive 0.12.0, Spark SQL 1.2.0 also suffers from this issue. The snippet doesn't show it because the Hive 0.12.0 JDBC driver doesn't execute a {{USE <db>}} statement to switch the current database even when the JDBC connection URL specifies a database name. If you replace the lines in the {{try}} block with:
{code}
      val conn = DriverManager.getConnection(url)
      val stmt = conn.createStatement()
      stmt.execute("use hello;")
      stmt.close()
      println("Finished: " + i)
{code}
you'll see exactly the same exceptions.

> Spark SQL built for Hive 13 fails under concurrent metadata queries
> -------------------------------------------------------------------
>
>                 Key: SPARK-4908
>                 URL: https://issues.apache.org/jira/browse/SPARK-4908
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: David Ross
>            Assignee: Cheng Lian
>            Priority: Blocker
>             Fix For: 1.3.0, 1.2.1
>
>
> We are on trunk: {{1.3.0-SNAPSHOT}}, as of this commit:
> https://github.com/apache/spark/commit/3d0c37b8118f6057a663f959321a79b8061132b6
> We are using Spark built for Hive 13, using this option:
> {{-Phive-0.13.1}}
> In single-threaded mode, normal operations look fine. However, under concurrency, with at least 2 concurrent connections, metadata queries fail.
> For example, {{USE some_db}}, {{SHOW TABLES}}, and the implicit {{USE}} statement when you pass a default schema in the JDBC URL all fail.
> {{SELECT}} queries like {{SELECT * FROM some_table}} do not have this issue.
> Here is some example code:
> {code}
> object main extends App {
>   import java.sql._
>   import scala.concurrent._
>   import scala.concurrent.duration._
>   import scala.concurrent.ExecutionContext.Implicits.global
>   Class.forName("org.apache.hive.jdbc.HiveDriver")
>   val host = "localhost" // update this
>   val url = s"jdbc:hive2://${host}:10511/some_db" // update this
>   val future = Future.traverse(1 to 3) { i =>
>     Future {
>       println("Starting: " + i)
>       try {
>         val conn = DriverManager.getConnection(url)
>       } catch {
>         case e: Throwable => e.printStackTrace()
>         println("Failed: " + i)
>       }
>       println("Finishing: " + i)
>     }
>   }
>   Await.result(future, 2.minutes)
>   println("done!")
> }
> {code}
> Here is the output:
> {code}
> Starting: 1
> Starting: 3
> Starting: 2
> java.sql.SQLException: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Operation cancelled
> 	at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:121)
> 	at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:109)
> 	at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:231)
> 	at org.apache.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:451)
> 	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:195)
> 	at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:664)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:270)
> 	at com.atscale.engine.connection.pool.main$$anonfun$30$$anonfun$apply$2.apply$mcV$sp(ConnectionManager.scala:896)
> 	at com.atscale.engine.connection.pool.main$$anonfun$30$$anonfun$apply$2.apply(ConnectionManager.scala:893)
> 	at com.atscale.engine.connection.pool.main$$anonfun$30$$anonfun$apply$2.apply(ConnectionManager.scala:893)
> 	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
> 	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
> 	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
> 	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> 	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> 	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> 	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Failed: 3
> Finishing: 3
> java.sql.SQLException: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Operation cancelled
> 	at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:121)
> 	at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:109)
> 	at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:231)
> 	at org.apache.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:451)
> 	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:195)
> 	at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:664)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:270)
> 	at com.atscale.engine.connection.pool.main$$anonfun$30$$anonfun$apply$2.apply$mcV$sp(ConnectionManager.scala:896)
> 	at com.atscale.engine.connection.pool.main$$anonfun$30$$anonfun$apply$2.apply(ConnectionManager.scala:893)
> 	at com.atscale.engine.connection.pool.main$$anonfun$30$$anonfun$apply$2.apply(ConnectionManager.scala:893)
> 	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
> 	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
> 	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
> 	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> 	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> 	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> 	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Failed: 2
> Finishing: 2
> Finishing: 1
> done!
> {code}
> Here are the errors from Spark Logs:
> {code}
> 14/12/19 21:44:55 INFO thrift.ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V6
> 14/12/19 21:44:55 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
> 14/12/19 21:44:55 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
> 14/12/19 21:44:55 INFO thriftserver.SparkExecuteStatementOperation: Running query 'use as_adventure'
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parsing command: use as_adventure
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parse Completed
> 14/12/19 21:44:55 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Concurrency mode is disabled, not creating a lock manager
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parsing command: use as_adventure
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parse Completed
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=parse start=1419025495084 end=1419025495084 duration=0 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Semantic Analysis Completed
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=semanticAnalyze start=1419025495084 end=1419025495084 duration=0 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=compile start=1419025495084 end=1419025495085 duration=1 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Starting command: use as_adventure
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=TimeToSubmit start=1419025495084 end=1419025495085 duration=1 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=runTasks start=1419025495085 end=1419025495098 duration=13 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1419025495085 end=1419025495098 duration=13 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: OK
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1419025495098 end=1419025495098 duration=0 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=Driver.run start=1419025495084 end=1419025495098 duration=14 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO thrift.ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V6
> 14/12/19 21:44:55 INFO thrift.ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V6
> 14/12/19 21:44:55 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
> 14/12/19 21:44:55 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
> 14/12/19 21:44:55 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
> 14/12/19 21:44:55 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
> 14/12/19 21:44:55 INFO thriftserver.SparkExecuteStatementOperation: Running query 'use as_adventure'
> 14/12/19 21:44:55 INFO thriftserver.SparkExecuteStatementOperation: Running query 'use as_adventure'
> 14/12/19 21:44:55 INFO thriftserver.SparkExecuteStatementOperation: Result Schema: List(result#274)
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parsing command: use as_adventure
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parse Completed
> 14/12/19 21:44:55 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Concurrency mode is disabled, not creating a lock manager
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parsing command: use as_adventure
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parse Completed
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=parse start=1419025495165 end=1419025495165 duration=0 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Semantic Analysis Completed
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=semanticAnalyze start=1419025495165 end=1419025495166 duration=1 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=compile start=1419025495165 end=1419025495166 duration=1 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Starting command: use as_adventure
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=TimeToSubmit start=1419025495165 end=1419025495166 duration=1 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parsing command: use as_adventure
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parse Completed
> 14/12/19 21:44:55 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Concurrency mode is disabled, not creating a lock manager
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 WARN ql.Driver: Shutting down task : Stage-0:DDL
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parsing command: use as_adventure
> 14/12/19 21:44:55 INFO parse.ParseDriver: Parse Completed
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=parse start=1419025495173 end=1419025495174 duration=1 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Semantic Analysis Completed
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=semanticAnalyze start=1419025495174 end=1419025495174 duration=0 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=compile start=1419025495172 end=1419025495177 duration=5 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO ql.Driver: Starting command: use as_adventure
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=TimeToSubmit start=1419025495172 end=1419025495177 duration=5 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=runTasks start=1419025495167 end=1419025495188 duration=21 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 ERROR ql.Driver: FAILED: Operation cancelled
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1419025495166 end=1419025495189 duration=23 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1419025495189 end=1419025495189 duration=0 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 WARN ql.Driver: Shutting down task : Stage-0:DDL
> 14/12/19 21:44:55 ERROR hive.HiveContext:
> ======================
> HIVE FAILURE OUTPUT
> ======================
> RDDLike.scala:58)
> 	at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
> 	at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:161)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:218)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
> 	at com.sun.proxy.$Proxy18.executeStatementAsync(Unknown Source)
> 	at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:233)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
> 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
> 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> FAILED: Hive Internal Error: org.apache.hadoop.hive.ql.metadata.HiveException(FAILED: Operation cancelled)
> org.apache.hadoop.hive.ql.metadata.HiveException: FAILED: Operation cancelled
> 	at org.apache.hadoop.hive.ql.DriverContext.checkShutdown(DriverContext.java:125)
> 	at org.apache.hadoop.hive.ql.DriverContext.launching(DriverContext.java:91)
> 	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1497)
> 	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
> 	at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
> 	at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
> 	at org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
> 	at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
> 	at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
> 	at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:161)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:218)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
> 	at com.sun.proxy.$Proxy18.executeStatementAsync(Unknown Source)
> 	at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:233)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
> 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
> 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> OK
> OK
> OK
> OK
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> OK
> OK
> FAILED: Operation cancelled
> FAILED: Hive Internal Error: java.lang.NullPointerException(null)
> java.lang.NullPointerException
> 	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1194)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
> 	at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
> 	at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
> 	at org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
> 	at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
> 	at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
> 	at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:161)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:218)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
> 	at com.sun.proxy.$Proxy18.executeStatementAsync(Unknown Source)
> 	at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:233)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
> 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
> 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> FAILED: Operation cancelled
> OK
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> OK
> FAILED: Operation cancelled
> OK
> OK
> FAILED: Operation cancelled
> OK
> OK
> FAILED: Operation cancelled
> OK
> OK
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> OK
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> OK
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> OK
> FAILED: Operation cancelled
> ======================
> END HIVE FAILURE OUTPUT
> ======================
> 14/12/19 21:44:55 ERROR thriftserver.SparkExecuteStatementOperation: Error executing query:
> org.apache.spark.sql.execution.QueryExecutionException: FAILED: Operation cancelled
> 	at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:309)
> 	at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
> 	at org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
> 	at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
> 	at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
> 	at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:161)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:218)
> 	at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
> 	at com.sun.proxy.$Proxy18.executeStatementAsync(Unknown Source)
> 	at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:233)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
> 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
> 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> 14/12/19 21:44:55 WARN thrift.ThriftCLIService: Error executing statement:
> org.apache.hive.service.cli.HiveSQLException: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Operation cancelled
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:192)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:218)
> 	at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
> 	at com.sun.proxy.$Proxy18.executeStatementAsync(Unknown Source)
> 	at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:233)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
> 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
> 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=runTasks start=1419025495177 end=1419025495197 duration=20 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 ERROR ql.Driver: FAILED: Operation cancelled
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1419025495177 end=1419025495200 duration=23 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1419025495200 end=1419025495200 duration=0 from=org.apache.hadoop.hive.ql.Driver>
> 14/12/19 21:44:55 ERROR hive.HiveContext:
> ======================
> HIVE FAILURE OUTPUT
> ======================
> ache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
> 	at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:161)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:218)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
> 	at com.sun.proxy.$Proxy18.executeStatementAsync(Unknown Source)
> 	at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:233)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
> 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
> 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> FAILED: Hive Internal Error: org.apache.hadoop.hive.ql.metadata.HiveException(FAILED: Operation cancelled)
> org.apache.hadoop.hive.ql.metadata.HiveException: FAILED: Operation cancelled
> 	at org.apache.hadoop.hive.ql.DriverContext.checkShutdown(DriverContext.java:125)
> 	at org.apache.hadoop.hive.ql.DriverContext.launching(DriverContext.java:91)
> 	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1497)
> 	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
> 	at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
> 	at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
> 	at org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
> 	at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
> 	at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
> 	at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:161)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:218)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
> 	at com.sun.proxy.$Proxy18.executeStatementAsync(Unknown Source)
> 	at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:233)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
> 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
> 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> OK
> OK
> OK
> OK
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> OK
> OK
> FAILED: Operation cancelled
> FAILED: Hive Internal Error: java.lang.NullPointerException(null)
> java.lang.NullPointerException
> 	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1194)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
> 	at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
> 	at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
> 	at org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
> 	at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
> 	at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
> 	at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:161)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:218)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
> 	at com.sun.proxy.$Proxy18.executeStatementAsync(Unknown Source)
> 	at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:233)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
> 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
> 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> FAILED: Operation cancelled
> OK
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> OK
> FAILED: Operation cancelled
> OK
> OK
> FAILED: Operation cancelled
> OK
> OK
> FAILED: Operation cancelled
> OK
> OK
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> OK
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> OK
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> OK
> FAILED: Operation cancelled
> FAILED: Operation cancelled
> ======================
> END HIVE FAILURE OUTPUT
> ======================
> 14/12/19 21:44:55 ERROR thriftserver.SparkExecuteStatementOperation: Error executing query:
> org.apache.spark.sql.execution.QueryExecutionException: FAILED: Operation cancelled
> 	at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:309)
> 	at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
> 	at org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
> 	at org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
> 	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
> 	at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
> 	at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
> 	at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:161)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:218)
> 	at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
> 	at com.sun.proxy.$Proxy18.executeStatementAsync(Unknown Source)
> 	at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:233)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
> 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
> 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> 14/12/19 21:44:55 WARN thrift.ThriftCLIService: Error executing statement:
> org.apache.hive.service.cli.HiveSQLException: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Operation cancelled
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:192)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:218)
> 	at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:483)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
> 	at com.sun.proxy.$Proxy18.executeStatementAsync(Unknown Source)
> 	at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:233)
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
> 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
> 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> {code}


