Posted to user@spark.apache.org by Dasun Hegoda <da...@gmail.com> on 2016/01/01 17:03:22 UTC

Re: ERROR server.TThreadPoolServer: Error occurred during processing of message

Bumping this again - any ideas?

On Tue, Dec 29, 2015 at 12:08 AM, Dasun Hegoda <da...@gmail.com>
wrote:

> Anyone able to help with this?
>
> On Sun, Dec 27, 2015 at 11:30 AM, Dasun Hegoda <da...@gmail.com>
> wrote:
>
>> I was able to narrow down where the problem is: it's Spark. When I start
>> HiveServer2 manually and run a query, everything works fine, but when I try
>> to access Hive through Spark's Thrift port it does not work and throws the
>> error above.
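>>
>> To take JasperStudio out of the picture, the same query can be run against
>> whichever server is listening using beeline (it ships with Hive; both
>> servers default to port 10000, so I start one at a time). A sketch, with
>> the host from my hive-site.xml and the hduser account from the logs:
>>
>>     beeline -u jdbc:hive2://192.168.7.87:10000/default -n hduser -e "show tables;"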
>>
>> Please help me to fix this.
>>
>> On Sun, Dec 27, 2015 at 11:15 AM, Dasun Hegoda <da...@gmail.com>
>> wrote:
>>
>>> Yes, I tried that; it didn't work for me.
>>>
>>>
>>> On Sun, Dec 27, 2015 at 10:56 AM, Ted Yu <yu...@gmail.com> wrote:
>>>
>>>> Have you seen this?
>>>>
>>>>
>>>> http://stackoverflow.com/questions/30705576/python-cannot-connect-hiveserver2
>>>>
>>>> On Sat, Dec 26, 2015 at 9:09 PM, Dasun Hegoda <da...@gmail.com>
>>>> wrote:
>>>>
>>>>> I'm running apache-hive-1.2.1-bin and spark-1.5.1-bin-hadoop2.6, with Spark
>>>>> as the Hive execution engine, on Ubuntu 14.04. When I try to connect through
>>>>> JasperStudio using the Thrift port, I get the error below.
>>>>>
>>>>>     15/12/26 23:36:20 ERROR server.TThreadPoolServer: Error occurred
>>>>> during processing of message.
>>>>>     java.lang.RuntimeException:
>>>>> org.apache.thrift.transport.TSaslTransportException: No data or no sasl
>>>>> data in the stream
>>>>>     at
>>>>> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
>>>>>     at
>>>>> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
>>>>>     at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>     at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>     at java.lang.Thread.run(Thread.java:745)
>>>>>     Caused by: org.apache.thrift.transport.TSaslTransportException: No
>>>>> data or no sasl data in the stream
>>>>>     at
>>>>> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:328)
>>>>>     at
>>>>> org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
>>>>>     at
>>>>> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
>>>>>     ... 4 more
>>>>>     15/12/26 23:36:20 INFO thrift.ThriftCLIService: Client protocol
>>>>> version: HIVE_CLI_SERVICE_PROTOCOL_V5
>>>>>     15/12/26 23:36:20 INFO session.SessionState: Created local
>>>>> directory: /tmp/c670ff55-01bb-4f6f-a375-d22a13c44eaf_resources
>>>>>     15/12/26 23:36:20 INFO session.SessionState: Created HDFS
>>>>> directory: /tmp/hive/anonymous/c670ff55-01bb-4f6f-a375-d22a13c44eaf
>>>>>     15/12/26 23:36:20 INFO session.SessionState: Created local
>>>>> directory: /tmp/hduser/c670ff55-01bb-4f6f-a375-d22a13c44eaf
>>>>>     15/12/26 23:36:20 INFO session.SessionState: Created HDFS
>>>>> directory:
>>>>> /tmp/hive/anonymous/c670ff55-01bb-4f6f-a375-d22a13c44eaf/_tmp_space.db
>>>>>     15/12/26 23:36:20 INFO
>>>>> thriftserver.SparkExecuteStatementOperation: Running query 'use default'
>>>>> with d842cd88-2fda-42b2-b943-468017e95f37
>>>>>     15/12/26 23:36:20 INFO parse.ParseDriver: Parsing command: use
>>>>> default
>>>>>     15/12/26 23:36:20 INFO parse.ParseDriver: Parse Completed
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: <PERFLOG method=Driver.run
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: <PERFLOG
>>>>> method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: <PERFLOG method=compile
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: <PERFLOG method=parse
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO parse.ParseDriver: Parsing command: use
>>>>> default
>>>>>     15/12/26 23:36:20 INFO parse.ParseDriver: Parse Completed
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: </PERFLOG method=parse
>>>>> start=1451190980590 end=1451190980591 duration=1
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: <PERFLOG
>>>>> method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO metastore.HiveMetaStore: 2: get_database:
>>>>> default
>>>>>     15/12/26 23:36:20 INFO HiveMetaStore.audit: ugi=hduser
>>>>> ip=unknown-ip-addr cmd=get_database: default
>>>>>     15/12/26 23:36:20 INFO metastore.HiveMetaStore: 2: Opening raw
>>>>> store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
>>>>>     15/12/26 23:36:20 INFO metastore.ObjectStore: ObjectStore,
>>>>> initialize called
>>>>>     15/12/26 23:36:20 INFO DataNucleus.Query: Reading in results for
>>>>> query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the
>>>>> connection used is closing
>>>>>     15/12/26 23:36:20 INFO metastore.MetaStoreDirectSql: Using direct
>>>>> SQL, underlying DB is DERBY
>>>>>     15/12/26 23:36:20 INFO metastore.ObjectStore: Initialized
>>>>> ObjectStore
>>>>>     15/12/26 23:36:20 INFO ql.Driver: Semantic Analysis Completed
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: </PERFLOG
>>>>> method=semanticAnalyze start=1451190980592 end=1451190980620 duration=28
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO ql.Driver: Returning Hive schema:
>>>>> Schema(fieldSchemas:null, properties:null)
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: </PERFLOG method=compile
>>>>> start=1451190980588 end=1451190980621 duration=33
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO ql.Driver: Concurrency mode is disabled,
>>>>> not creating a lock manager
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: <PERFLOG
>>>>> method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO ql.Driver: Starting
>>>>> command(queryId=hduser_20151226233620_6bc633ef-5c6f-49e4-9300-f79fdf0c357b):
>>>>> use default
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: </PERFLOG
>>>>> method=TimeToSubmit start=1451190980588 end=1451190980622 duration=34
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: <PERFLOG method=runTasks
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: <PERFLOG
>>>>> method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO ql.Driver: Starting task [Stage-0:DDL] in
>>>>> serial mode
>>>>>     15/12/26 23:36:20 INFO metastore.HiveMetaStore: 2: get_database:
>>>>> default
>>>>>     15/12/26 23:36:20 INFO HiveMetaStore.audit: ugi=hduser
>>>>> ip=unknown-ip-addr cmd=get_database: default
>>>>>     15/12/26 23:36:20 INFO metastore.HiveMetaStore: 2: get_database:
>>>>> default
>>>>>     15/12/26 23:36:20 INFO HiveMetaStore.audit: ugi=hduser
>>>>> ip=unknown-ip-addr cmd=get_database: default
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: </PERFLOG method=runTasks
>>>>> start=1451190980622 end=1451190980637 duration=15
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: </PERFLOG
>>>>> method=Driver.execute start=1451190980621 end=1451190980637 duration=16
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     OK
>>>>>     15/12/26 23:36:20 INFO ql.Driver: OK
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: <PERFLOG
>>>>> method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: </PERFLOG
>>>>> method=releaseLocks start=1451190980639 end=1451190980639 duration=0
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO log.PerfLogger: </PERFLOG method=Driver.run
>>>>> start=1451190980587 end=1451190980639 duration=52
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:36:20 INFO
>>>>> thriftserver.SparkExecuteStatementOperation: Result Schema: List(result#28)
>>>>>     15/12/26 23:36:20 INFO
>>>>> thriftserver.SparkExecuteStatementOperation: Running query 'SELECT * FROM
>>>>> service' with 37916038-9856-43eb-8b73-920f9faf738f
>>>>>     15/12/26 23:36:20 INFO parse.ParseDriver: Parsing command: SELECT
>>>>> * FROM service
>>>>>     15/12/26 23:36:20 INFO parse.ParseDriver: Parse Completed
>>>>>     15/12/26 23:36:20 INFO metastore.HiveMetaStore: 2: get_table :
>>>>> db=default tbl=service
>>>>>     15/12/26 23:36:20 INFO HiveMetaStore.audit: ugi=hduser
>>>>> ip=unknown-ip-addr cmd=get_table : db=default tbl=service
>>>>>     15/12/26 23:36:20 ERROR
>>>>> thriftserver.SparkExecuteStatementOperation: Error executing query,
>>>>> currentState RUNNING,
>>>>>     org.apache.spark.sql.AnalysisException: no such table service;
>>>>> line 1 pos 14
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:260)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$7.applyOrElse(Analyzer.scala:268)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$7.applyOrElse(Analyzer.scala:264)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:51)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:56)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:249)
>>>>>     at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
>>>>>     at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>>>>>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>>>>>     at
>>>>> scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
>>>>>     at
>>>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
>>>>>     at
>>>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
>>>>>     at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
>>>>>     at scala.collection.AbstractIterator.to(Iterator.scala:1157)
>>>>>     at
>>>>> scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
>>>>>     at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
>>>>>     at
>>>>> scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
>>>>>     at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.trees.TreeNode.transformChildren(TreeNode.scala:279)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:54)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:264)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:254)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:83)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:80)
>>>>>     at
>>>>> scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)
>>>>>     at scala.collection.immutable.List.foldLeft(List.scala:84)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:80)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:72)
>>>>>     at scala.collection.immutable.List.foreach(List.scala:318)
>>>>>     at
>>>>> org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:72)
>>>>>     at
>>>>> org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:916)
>>>>>     at
>>>>> org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:916)
>>>>>     at
>>>>> org.apache.spark.sql.SQLContext$QueryExecution.assertAnalyzed(SQLContext.scala:914)
>>>>>     at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:132)
>>>>>     at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>>>>>     at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:725)
>>>>>     at
>>>>> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.runInternal(SparkExecuteStatementOperation.scala:224)
>>>>>     at
>>>>> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(SparkExecuteStatementOperation.scala:144)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:388)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:369)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>     at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>     at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>     at
>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
>>>>>     at com.sun.proxy.$Proxy47.executeStatement(Unknown Source)
>>>>>     at
>>>>> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:261)
>>>>>     at
>>>>> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:486)
>>>>>     at
>>>>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
>>>>>     at
>>>>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
>>>>>     at
>>>>> org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>>>>>     at
>>>>> org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>>>>>     at
>>>>> org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
>>>>>     at
>>>>> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
>>>>>     at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>     at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>     at java.lang.Thread.run(Thread.java:745)
>>>>>     15/12/26 23:36:20 WARN thrift.ThriftCLIService: Error executing
>>>>> statement:
>>>>>     org.apache.hive.service.cli.HiveSQLException:
>>>>> org.apache.spark.sql.AnalysisException: no such table service; line 1 pos 14
>>>>>     at
>>>>> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.runInternal(SparkExecuteStatementOperation.scala:259)
>>>>>     at
>>>>> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(SparkExecuteStatementOperation.scala:144)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:388)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:369)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>     at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>     at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>     at
>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>>>>     at
>>>>> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
>>>>>     at com.sun.proxy.$Proxy47.executeStatement(Unknown Source)
>>>>>     at
>>>>> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:261)
>>>>>     at
>>>>> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:486)
>>>>>     at
>>>>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
>>>>>     at
>>>>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
>>>>>     at
>>>>> org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>>>>>     at
>>>>> org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>>>>>     at
>>>>> org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
>>>>>     at
>>>>> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
>>>>>     at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>     at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>     at java.lang.Thread.run(Thread.java:745)
>>>>>     15/12/26 23:41:08 ERROR server.TThreadPoolServer: Error occurred
>>>>> during processing of message.
>>>>>     java.lang.RuntimeException:
>>>>> org.apache.thrift.transport.TSaslTransportException: No data or no sasl
>>>>> data in the stream
>>>>>     at
>>>>> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
>>>>>     at
>>>>> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
>>>>>     at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>     at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>     at java.lang.Thread.run(Thread.java:745)
>>>>>     Caused by: org.apache.thrift.transport.TSaslTransportException: No
>>>>> data or no sasl data in the stream
>>>>>     at
>>>>> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:328)
>>>>>     at
>>>>> org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
>>>>>     at
>>>>> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
>>>>>     ... 4 more
>>>>>     15/12/26 23:41:08 INFO thrift.ThriftCLIService: Client protocol
>>>>> version: HIVE_CLI_SERVICE_PROTOCOL_V5
>>>>>     15/12/26 23:41:08 INFO session.SessionState: Created local
>>>>> directory: /tmp/aa7ce472-0284-4469-823b-748ef786ab73_resources
>>>>>     15/12/26 23:41:08 INFO session.SessionState: Created HDFS
>>>>> directory: /tmp/hive/anonymous/aa7ce472-0284-4469-823b-748ef786ab73
>>>>>     15/12/26 23:41:08 INFO session.SessionState: Created local
>>>>> directory: /tmp/hduser/aa7ce472-0284-4469-823b-748ef786ab73
>>>>>     15/12/26 23:41:08 INFO session.SessionState: Created HDFS
>>>>> directory:
>>>>> /tmp/hive/anonymous/aa7ce472-0284-4469-823b-748ef786ab73/_tmp_space.db
>>>>>     15/12/26 23:41:08 INFO
>>>>> thriftserver.SparkExecuteStatementOperation: Running query 'use default'
>>>>> with 6a274f01-2a83-44b9-b970-2154792af7a2
>>>>>     15/12/26 23:41:08 INFO parse.ParseDriver: Parsing command: use
>>>>> default
>>>>>     15/12/26 23:41:08 INFO parse.ParseDriver: Parse Completed
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: <PERFLOG method=Driver.run
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: <PERFLOG
>>>>> method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: <PERFLOG method=compile
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: <PERFLOG method=parse
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO parse.ParseDriver: Parsing command: use
>>>>> default
>>>>>     15/12/26 23:41:08 INFO parse.ParseDriver: Parse Completed
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: </PERFLOG method=parse
>>>>> start=1451191268389 end=1451191268390 duration=1
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: <PERFLOG
>>>>> method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO metastore.HiveMetaStore: 2: get_database:
>>>>> default
>>>>>     15/12/26 23:41:08 INFO HiveMetaStore.audit: ugi=hduser
>>>>> ip=unknown-ip-addr cmd=get_database: default
>>>>>     15/12/26 23:41:08 INFO ql.Driver: Semantic Analysis Completed
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: </PERFLOG
>>>>> method=semanticAnalyze start=1451191268390 end=1451191268397 duration=7
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO ql.Driver: Returning Hive schema:
>>>>> Schema(fieldSchemas:null, properties:null)
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: </PERFLOG method=compile
>>>>> start=1451191268387 end=1451191268398 duration=11
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO ql.Driver: Concurrency mode is disabled,
>>>>> not creating a lock manager
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: <PERFLOG
>>>>> method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO ql.Driver: Starting
>>>>> command(queryId=hduser_20151226234108_27b4ad3d-0f88-4a81-83f6-eaf0ef49cd22):
>>>>> use default
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: </PERFLOG
>>>>> method=TimeToSubmit start=1451191268387 end=1451191268399 duration=12
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: <PERFLOG method=runTasks
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: <PERFLOG
>>>>> method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO ql.Driver: Starting task [Stage-0:DDL] in
>>>>> serial mode
>>>>>     15/12/26 23:41:08 INFO metastore.HiveMetaStore: 2: get_database:
>>>>> default
>>>>>     15/12/26 23:41:08 INFO HiveMetaStore.audit: ugi=hduser
>>>>> ip=unknown-ip-addr cmd=get_database: default
>>>>>     15/12/26 23:41:08 INFO metastore.HiveMetaStore: 2: get_database:
>>>>> default
>>>>>     15/12/26 23:41:08 INFO HiveMetaStore.audit: ugi=hduser
>>>>> ip=unknown-ip-addr cmd=get_database: default
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: </PERFLOG method=runTasks
>>>>> start=1451191268399 end=1451191268412 duration=13
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: </PERFLOG
>>>>> method=Driver.execute start=1451191268398 end=1451191268412 duration=14
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     OK
>>>>>     15/12/26 23:41:08 INFO ql.Driver: OK
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: <PERFLOG
>>>>> method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: </PERFLOG
>>>>> method=releaseLocks start=1451191268413 end=1451191268413 duration=0
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO log.PerfLogger: </PERFLOG method=Driver.run
>>>>> start=1451191268387 end=1451191268413 duration=26
>>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>>>     15/12/26 23:41:08 INFO
>>>>> thriftserver.SparkExecuteStatementOperation: Result Schema: List(result#43)
>>>>>
>>>>>
>>>>> Below is the relevant part of apache-hive-1.2.1-bin/conf/hive-site.xml (the
>>>>> snippet is truncated at the top):
>>>>>
>>>>>
>>>>>         <description>
>>>>>                The cluster manager to connect to
>>>>>         </description>
>>>>>       </property>
>>>>>
>>>>>        <property>
>>>>>         <name>spark.serializer</name>
>>>>>         <value>org.apache.spark.serializer.KryoSerializer</value>
>>>>>         <description>
>>>>>                Class to use for serializing objects that will be sent
>>>>> over the network
>>>>>         </description>
>>>>>       </property>
>>>>>
>>>>>
>>>>>
>>>>>     <property>
>>>>>       <name>hive.server2.authentication</name>
>>>>>       <value>NONE</value>
>>>>>       <description>
>>>>>         Client authentication types.
>>>>>            NONE: no authentication check
>>>>>            LDAP: LDAP/AD based authentication
>>>>>            KERBEROS: Kerberos/GSSAPI authentication
>>>>>            CUSTOM: Custom authentication provider
>>>>>                    (Use with property
>>>>> hive.server2.custom.authentication.class)
>>>>>       </description>
>>>>>     </property>
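>>>>>
>>>>> (From what I've read, with authentication set to NONE the server still
>>>>> performs a SASL PLAIN handshake, and 'No data or no sasl data in the stream'
>>>>> is what it logs when a client connects without SASL. So the client-side JDBC
>>>>> URL has to match the server setting - a sketch, using my host:
>>>>>
>>>>>     jdbc:hive2://192.168.7.87:10000/default               <- SASL PLAIN, matches NONE
>>>>>     jdbc:hive2://192.168.7.87:10000/default;auth=noSasl   <- only if the server runs NOSASL
>>>>> )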
>>>>>
>>>>>     <property>
>>>>>       <name>hive.metastore.sasl.enabled</name>
>>>>>       <value>false</value>
>>>>>       <description>If true, the metastore thrift interface will be
>>>>> secured with SASL. Clients must authenticate with Kerberos.</description>
>>>>>     </property>
>>>>>
>>>>>     <!--Hive server -->
>>>>>     <property>
>>>>>       <name>hive.server2.thrift.port</name>
>>>>>       <value>10000</value>
>>>>>       <description>Port number of HiveServer2 Thrift interface.
>>>>>       Can be overridden by setting
>>>>> $HIVE_SERVER2_THRIFT_PORT</description>
>>>>>     </property>
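>>>>>
>>>>> (HiveServer2 and the Spark Thrift Server both default to port 10000, so
>>>>> with both installed it is worth confirming which process actually owns the
>>>>> port - a quick check, assuming standard Ubuntu tooling:
>>>>>
>>>>>     sudo netstat -tnlp | grep 10000
>>>>> )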
>>>>>
>>>>>     <property>
>>>>>       <name>hive.server2.thrift.bind.host</name>
>>>>>       <value>192.168.7.87</value>
>>>>>       <description>Bind host on which to run the HiveServer2 Thrift
>>>>> interface.
>>>>>       Can be overridden by setting
>>>>> $HIVE_SERVER2_THRIFT_BIND_HOST</description>
>>>>>     </property>
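>>>>>
>>>>> One more data point: the log above says 'Using direct SQL, underlying DB is
>>>>> DERBY', which I read as the Spark Thrift Server falling back to its own
>>>>> embedded Derby metastore instead of the metastore HiveServer2 uses - that
>>>>> would also explain the 'no such table service' error. Is the fix to point
>>>>> Spark at the same hive-site.xml and restart the Thrift server? A sketch
>>>>> with my paths:
>>>>>
>>>>>     cp apache-hive-1.2.1-bin/conf/hive-site.xml spark-1.5.1-bin-hadoop2.6/conf/
>>>>>     spark-1.5.1-bin-hadoop2.6/sbin/stop-thriftserver.sh
>>>>>     spark-1.5.1-bin-hadoop2.6/sbin/start-thriftserver.sh \
>>>>>         --hiveconf hive.server2.thrift.port=10000 \
>>>>>         --hiveconf hive.server2.thrift.bind.host=192.168.7.87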
>>>>>
>>>>> How can I fix this?
>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>>
>>
>>
>>
>>
>
>
>
>



-- 
Regards,
Dasun Hegoda, Software Engineer
www.dasunhegoda.com | dasunhegoda@gmail.com