Posted to dev@atlas.apache.org by Herman Yu <he...@teeupdata.com> on 2016/01/19 21:56:15 UTC

v0.6 on HDP 2.3.2 sandbox

Hi everyone,

I am having some problems with installing and configuring v0.6 on HDP 2.3.2 sandbox.

HDP 2.3.2 comes with v0.5 pre-configured through Ambari. After compiling the v0.6 source code, I updated the link /usr/hdp/current/atlas-server to point to the new v0.6 folder, and I made the necessary changes in Ambari (e.g. the notification-related parameters in application.properties). Atlas starts fine (though with some Kafka-related errors in the log file); however, Hive stopped working, and any operation now fails with the errors below.

I suspect this is related to the Hive hook. After several tries, I figured out that if I change atlas.hook.hive.synchronous from true to false, Hive starts working again, but then the hook doesn't capture any Hive queries into Atlas.

Did anyone experience the same problem? Also, how does Hive know where the hook's jar file is located? In HDP 2.3.2 I don't see HIVE_AUX_PATH appended with <Atlas_home>/bridge/hive; I had to append it manually in Ambari.
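For context, this is roughly how the hook is wired up on my sandbox right now (a sketch of my own settings, not a reference configuration):

    # hive-site.xml (set through Ambari): register the Atlas post-execution hook
    hive.exec.post.hooks=org.apache.atlas.hive.hook.HiveHook

    # Atlas application.properties: with true, a hook failure blocks the Hive query;
    # with false, Hive runs but (in my case) nothing shows up in Atlas
    atlas.hook.hive.synchronous=true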


org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Hive Internal Error: com.sun.jersey.api.client.ClientHandlerException(java.io.IOException: java.net.ConnectException: Connection refused)
        at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:315)
        at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:156)
        at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:183)
        at org.apache.hive.service.cli.operation.Operation.run(Operation.java:257)
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:388)
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:369)
        at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:261)
        at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:486)
        at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
        at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
        at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: com.sun.jersey.api.client.ClientHandlerException: java.io.IOException: java.net.ConnectException: Connection refused
        at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:149)
        at com.sun.jersey.api.client.Client.handle(Client.java:648)
        at com.sun.jersey.api.client.WebResource.handle(WebResource.java:670)
        at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
        at com.sun.jersey.api.client.WebResource$Builder.method(WebResource.java:623)
        at org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:584)
        at org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:579)
        at org.apache.atlas.AtlasClient.getType(AtlasClient.java:257)
        at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerHiveDataModel(HiveMetaStoreBridge.java:487)
        at org.apache.atlas.hive.hook.HiveHook.fireAndForget(HiveHook.java:197)
        at org.apache.atlas.hive.hook.HiveHook.run(HiveHook.java:174)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1522)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1054)
        at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154)
        ... 15 more


Thanks
Herman.


Re: v0.6 on HDP 2.3.2 sandbox

Posted by Herman Yu <he...@teeupdata.com>.
Hi Shwetha, much appreciated. I missed /etc/atlas/conf in the class path. Now the hook works. Herman. 
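P.S. For anyone hitting the same thing: a quick way to confirm the hook is firing again after the classpath change is to watch for its log line in hiveserver2.log, e.g. (assuming the default HDP log location):

    grep "Entered Atlas hook" /var/log/hive/hiveserver2.log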

> On Jan 21, 2016, at 9:30 AM, Shwetha Shivalingamurthy <ss...@hortonworks.com> wrote:
> 
> Hive-env.sh should contain
> export HADOOP_CLASSPATH=/etc/atlas/conf:/usr/hdp/current/atlas-server/hook/hive:${HADOOP_CLASSPATH}
> 
> 
> So, /etc/atlas/conf/ should contain atlas’ application.properties
> 
> 
> Regards,
> Shwetha


Re: v0.6 on HDP 2.3.2 sandbox

Posted by Shwetha Shivalingamurthy <ss...@hortonworks.com>.
Hive-env.sh should contain
export HADOOP_CLASSPATH=/etc/atlas/conf:/usr/hdp/current/atlas-server/hook/hive:${HADOOP_CLASSPATH}


So, /etc/atlas/conf/ should contain atlas’ application.properties
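As a minimal example, the application.properties there should at least point the hook at the Atlas server (shown with the default endpoint; adjust host and port to your setup):

    # endpoint the Hive hook's AtlasClient will call (default shown)
    atlas.rest.address=http://localhost:21000
    atlas.hook.hive.synchronous=true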


Regards,
Shwetha






On 21/01/16 7:46 pm, "Herman Yu" <he...@teeupdata.com> wrote:

>Thanks Shwetha. that’s the reason, the hook is getting application and
>client configurations from jar files under hook/hive. I must be missing
>some configuration steps? is it supposed to get those two config files
>from Hive configuration folder?
>
>Thanks
>Herman.


Re: v0.6 on HDP 2.3.2 sandbox

Posted by Herman Yu <he...@teeupdata.com>.
Thanks Shwetha, that's the reason: the hook is picking up the application and client configurations from the jar files under hook/hive. I must be missing some configuration step. Is it supposed to get those two config files from the Hive configuration folder?
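In case it is useful to others, a rough way to check which of the hook jars bundle their own copy of the config (paths assume the HDP layout above):

    for j in /usr/hdp/current/atlas-server/hook/hive/*.jar; do
      # list jar contents and print the jar if it carries an application.properties
      unzip -l "$j" | grep -q application.properties && echo "$j"
    done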

Thanks
Herman.


> On Jan 20, 2016, at 11:27 PM, Shwetha Shivalingamurthy <ss...@hortonworks.com> wrote:
> 
> Check if the atlas conf is picked up correctly. You can enable DEBUG logs
> for atlas by modifying the hive’s log4j.xml - add log level for package
> org.apache.atlas. DEBUG log level will print the location of atlas’
> application.properties that’s picked up
> 
> Regards,
> Shwetha


Re: v0.6 on HDP 2.3.2 sandbox

Posted by Shwetha Shivalingamurthy <ss...@hortonworks.com>.
Check if the atlas conf is picked up correctly. You can enable DEBUG logs for atlas by modifying the hive's log4j.xml: add a log level entry for the package org.apache.atlas. At DEBUG level it will print the location of the atlas application.properties that was picked up.
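For example, something along these lines in the log4j.xml (the exact file location depends on your Hive setup):

    <logger name="org.apache.atlas">
      <level value="DEBUG"/>
    </logger>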

Regards,
Shwetha






On 21/01/16 12:41 am, "Herman Yu" <he...@teeupdata.com> wrote:

>Hi Shewtha,
>
>Thanks for the link.  I followed the link, but am still getting the
>following error in hiveserver2.log. It seems to me that atlasclient is
>refused when trying to connect to atlas server via http.  I already have
>atlas.http.authentication.enabled=false set to false in both application
>and client properties files.
>
>
>any other places I need to check?
>
>thanks
>Herman.
>
>
>2016-01-20 13:55:22,568 INFO  [Atlas Logger 3]: hook.HiveHook (HiveHook.java:fireAndForget(192)) - Entered Atlas hook for hook type POST_EXEC_HOOK operation CREATETABLE
>2016-01-20 13:55:22,572 INFO  [HiveServer2-Background-Pool: Thread-191]: log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks start=1453316122568 end=1453316122572 duration=4 from=org.apache.hadoop.hive.ql.Driver>
>2016-01-20 13:55:22,573 INFO  [HiveServer2-Background-Pool: Thread-191]: log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=Driver.run start=1453316122342 end=1453316122573 duration=231 from=org.apache.hadoop.hive.ql.Driver>
>2016-01-20 13:55:22,634 INFO  [Atlas Logger 3]: security.SecureClientUtils (SecureClientUtils.java:getClientConnectionHandler(90)) - Real User: hive (auth:SIMPLE), is from ticket cache? false
>2016-01-20 13:55:22,635 INFO  [Atlas Logger 3]: security.SecureClientUtils (SecureClientUtils.java:getClientConnectionHandler(93)) - doAsUser: hue
>2016-01-20 13:55:22,635 INFO  [Atlas Logger 2]: hook.HiveHook (HiveHook.java:run(182)) - Atlas hook failed
>com.sun.jersey.api.client.ClientHandlerException: java.io.IOException: java.net.ConnectException: Connection refused
>        at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:149)
>        at com.sun.jersey.api.client.Client.handle(Client.java:648)
>        at com.sun.jersey.api.client.WebResource.handle(WebResource.java:670)
>        at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
>        at com.sun.jersey.api.client.WebResource$Builder.method(WebResource.java:623)
>        at org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:584)
>        at org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:579)
>        at org.apache.atlas.AtlasClient.getType(AtlasClient.java:257)
>        at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerHiveDataModel(HiveMetaStoreBridge.java:487)
>        at org.apache.atlas.hive.hook.HiveHook.fireAndForget(HiveHook.java:197)
>        at org.apache.atlas.hive.hook.HiveHook.access$200(HiveHook.java:66)
>        at org.apache.atlas.hive.hook.HiveHook$2.run(HiveHook.java:180)
>        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>        at java.lang.Thread.run(Thread.java:745)
>Caused by: java.io.IOException: java.net.ConnectException: Connection refused
>        at org.apache.atlas.security.SecureClientUtils$1$1.run(SecureClientUtils.java:106)
>        at org.apache.atlas.security.SecureClientUtils$1$1.run(SecureClientUtils.java:98)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at javax.security.auth.Subject.doAs(Subject.java:415)
>        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>        at org.apache.atlas.security.SecureClientUtils$1.getHttpURLConnection(SecureClientUtils.java:98)
>        at com.sun.jersey.client.urlconnection.URLConnectionClientHandler._invoke(URLConnectionClientHandler.java:159)
>        at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:147)
>        ... 16 more
>Caused by: java.net.ConnectException: Connection refused
>        at java.net.PlainSocketImpl.socketConnect(Native Method)
>        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
>        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
>        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
>        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
>        at java.net.Socket.connect(Socket.java:579)
>        at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
>        at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
>        at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
>        at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
>        at sun.net.www.http.HttpClient.New(HttpClient.java:308)
>        at sun.net.www.http.HttpClient.New(HttpClient.java:326)
>        at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:998)
>        at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:934)
>        at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:852)
>        at org.apache.hadoop.security.authentication.client.PseudoAuthenticator.authenticate(PseudoAuthenticator.java:76)
>        at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:128)
>        at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
>        at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticatedURL.openConnection(DelegationTokenAuthenticatedURL.java:322)
>        at org.apache.atlas.security.SecureClientUtils$1$1.run(SecureClientUtils.java:102)
>        ... 23 more
>
>
>> On Jan 20, 2016, at 2:35 AM, Shwetha Shivalingamurthy <ss...@hortonworks.com> wrote:
>> 
>> 
>> http://dev.hortonworks.com.s3.amazonaws.com/HDPDocuments/Ambari-2.2.0.0/bk_ambari_reference_guide/content/ch_integrating_atlas.html has the instructions on setting up 0.6 Atlas with Ambari
>> 
>> 
>> Regards,
>> Shwetha


Re: v0.6 on HDP 2.3.2 sandbox

Posted by Herman Yu <he...@teeupdata.com>.
Hi Shwetha,

Thanks for the link. I followed it, but I am still getting the following error in hiveserver2.log. It looks like the AtlasClient connection is refused when it tries to reach the Atlas server over HTTP. I already have atlas.http.authentication.enabled set to false in both the application and client properties files.
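(As a basic connectivity check, assuming Atlas runs on the default localhost:21000, hitting the server directly shows whether anything is listening at that address at all:

    curl -v http://localhost:21000/api/atlas/admin/version

If that is also refused, nothing is listening at the address the client resolves.)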


any other places I need to check?

thanks
Herman.


2016-01-20 13:55:22,568 INFO  [Atlas Logger 3]: hook.HiveHook (HiveHook.java:fireAndForget(192)) - Entered Atlas hook for hook type POST_EXEC_HOOK operation CREATETABLE
2016-01-20 13:55:22,572 INFO  [HiveServer2-Background-Pool: Thread-191]: log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks start=1453316122568 end=1453316122572 duration=4 from=org.apache.hadoop.hive.ql.Driver>
2016-01-20 13:55:22,573 INFO  [HiveServer2-Background-Pool: Thread-191]: log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=Driver.run start=1453316122342 end=1453316122573 duration=231 from=org.apache.hadoop.hive.ql.Driver>
2016-01-20 13:55:22,634 INFO  [Atlas Logger 3]: security.SecureClientUtils (SecureClientUtils.java:getClientConnectionHandler(90)) - Real User: hive (auth:SIMPLE), is from ticket cache? false
2016-01-20 13:55:22,635 INFO  [Atlas Logger 3]: security.SecureClientUtils (SecureClientUtils.java:getClientConnectionHandler(93)) - doAsUser: hue
2016-01-20 13:55:22,635 INFO  [Atlas Logger 2]: hook.HiveHook (HiveHook.java:run(182)) - Atlas hook failed
com.sun.jersey.api.client.ClientHandlerException: java.io.IOException: java.net.ConnectException: Connection refused
        at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:149)
        at com.sun.jersey.api.client.Client.handle(Client.java:648)
        at com.sun.jersey.api.client.WebResource.handle(WebResource.java:670)
        at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
        at com.sun.jersey.api.client.WebResource$Builder.method(WebResource.java:623)
        at org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:584)
        at org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:579)
        at org.apache.atlas.AtlasClient.getType(AtlasClient.java:257)
        at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerHiveDataModel(HiveMetaStoreBridge.java:487)
        at org.apache.atlas.hive.hook.HiveHook.fireAndForget(HiveHook.java:197)
        at org.apache.atlas.hive.hook.HiveHook.access$200(HiveHook.java:66)
        at org.apache.atlas.hive.hook.HiveHook$2.run(HiveHook.java:180)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.net.ConnectException: Connection refused
        at org.apache.atlas.security.SecureClientUtils$1$1.run(SecureClientUtils.java:106)
        at org.apache.atlas.security.SecureClientUtils$1$1.run(SecureClientUtils.java:98)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.atlas.security.SecureClientUtils$1.getHttpURLConnection(SecureClientUtils.java:98)
        at com.sun.jersey.client.urlconnection.URLConnectionClientHandler._invoke(URLConnectionClientHandler.java:159)
        at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:147)
        ... 16 more
Caused by: java.net.ConnectException: Connection refused
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:579)
        at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
        at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
        at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
        at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
        at sun.net.www.http.HttpClient.New(HttpClient.java:308)
        at sun.net.www.http.HttpClient.New(HttpClient.java:326)
        at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:998)
        at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:934)
        at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:852)
        at org.apache.hadoop.security.authentication.client.PseudoAuthenticator.authenticate(PseudoAuthenticator.java:76)
        at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:128)
        at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
        at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticatedURL.openConnection(DelegationTokenAuthenticatedURL.java:322)
        at org.apache.atlas.security.SecureClientUtils$1$1.run(SecureClientUtils.java:102)
        ... 23 more
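
The innermost "Caused by" is a socket-level java.net.ConnectException raised in PlainSocketImpl.socketConnect, so the request never reaches the Atlas web layer at all and the authentication flags should not matter here. A raw port probe shows the same thing; this is only a sketch and assumes the default Atlas HTTP port 21000 on the sandbox host:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class AtlasPortProbe {
    public static void main(String[] args) {
        // Assumed defaults for the HDP sandbox: Atlas on the local host, port 21000.
        String host = args.length > 0 ? args[0] : "localhost";
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 21000;
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), 5000);
            System.out.println(host + ":" + port + " is accepting connections.");
        } catch (IOException e) {
            // "Connection refused" here means no process is listening on that port.
            System.out.println("Cannot connect to " + host + ":" + port + ": " + e.getMessage());
        }
    }
}

If the probe also reports "Connection refused", the Atlas server process is either not running or bound to a different host/port than the one the hook resolves.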


> On Jan 20, 2016, at 2:35 AM, Shwetha Shivalingamurthy <ss...@hortonworks.com> wrote:
> 
> http://dev.hortonworks.com.s3.amazonaws.com/HDPDocuments/Ambari-2.2.0.0/bk_ambari_reference_guide/content/ch_integrating_atlas.html
> has the instructions on setting up 0.6 Atlas with Ambari
> 
> 
> Regards,
> Shwetha
> 
> 
> 
> 
> 
> 
> On 20/01/16 2:26 am, "Herman Yu" <he...@teeupdata.com> wrote:
> 
>> Hi everyone,
>> 
>> I am having some problems with installing and configuring v0.6 on HDP
>> 2.3.2 sandbox.
>> 
>> HDP 2.3.2 comes with v0.5 pre-configured with Ambari. After compiling
>> v0.6 source code, and updated the link /usr/hdp/current/atlas-server to
>> be associated with the new v0.6 folder. I also made necessary changes in
>> Ambari's (e.g. those notification related parameters in
>> application.properties). Atlas starts fine (even though with some errors
>> in log file which are kafka related), however, Hive stopped working with
>> the following errors with any operations.
>> 
>> I suspect this is the Hive hook related, after several tries, I figured
>> out that if I update atlas.hook.hive.synchronous from true to false, Hive
>> starts working fine however the hook doesn't capture any hive queries to
>> atlas.
>> 
>> Did anyone experience the same problem? Also, how does hive know where
>> the jar file of the hook is located? in HDP 2.3.2, I don't see
>> HIVE_AUX_PATH is appended with <Atlas_home>/bridge/hive, I had to
>> manually append this in Ambari.
>> 
>> 
>> org.apache.hive.service.cli.HiveSQLException: Error while processing
>> statement: FAILED: Hive Internal Error:
>> com.sun.jersey.api.client.ClientHandlerException(java.io.IOException:
>> java.net.ConnectException: Connection refused)
>>       at 
>> org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.j
>> ava:315)
>>       at 
>> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.j
>> ava:156)
>>       at 
>> org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperatio
>> n.java:183)
>>       at 
>> org.apache.hive.service.cli.operation.Operation.run(Operation.java:257)
>>       at 
>> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementIntern
>> al(HiveSessionImpl.java:388)
>>       at 
>> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveS
>> essionImpl.java:369)
>>       at 
>> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:26
>> 1)
>>       at 
>> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(Thrif
>> tCLIService.java:486)
>>       at 
>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.
>> getResult(TCLIService.java:1313)
>>       at 
>> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.
>> getResult(TCLIService.java:1298)
>>       at 
>> org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>>       at 
>> org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>>       at 
>> org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressP
>> rocessor.java:56)
>>       at 
>> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolSe
>> rver.java:285)
>>       at 
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:
>> 1145)
>>       at 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java
>> :615)
>>       at java.lang.Thread.run(Thread.java:745)
>> Caused by: com.sun.jersey.api.client.ClientHandlerException:
>> java.io.IOException: java.net.ConnectException: Connection refused
>>       at 
>> com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLC
>> onnectionClientHandler.java:149)
>>       at com.sun.jersey.api.client.Client.handle(Client.java:648)
>>       at 
>> com.sun.jersey.api.client.WebResource.handle(WebResource.java:670)
>>       at 
>> com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
>>       at 
>> com.sun.jersey.api.client.WebResource$Builder.method(WebResource.java:623)
>>       at 
>> org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:584)
>>       at 
>> org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:579)
>>       at org.apache.atlas.AtlasClient.getType(AtlasClient.java:257)
>>       at 
>> org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerHiveDataModel(Hiv
>> eMetaStoreBridge.java:487)
>>       at 
>> org.apache.atlas.hive.hook.HiveHook.fireAndForget(HiveHook.java:197)
>>       at org.apache.atlas.hive.hook.HiveHook.run(HiveHook.java:174)
>>       at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1522)
>>       at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
>>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
>>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1054)
>>       at 
>> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.j
>> ava:154)
>>       ... 15 more
>> 
>> 
>> Thanks
>> Herman.
>> 
> 


Re: v0.6 on HDP 2.3.2 sandbox

Posted by Shwetha Shivalingamurthy <ss...@hortonworks.com>.
http://dev.hortonworks.com.s3.amazonaws.com/HDPDocuments/Ambari-2.2.0.0/bk_ambari_reference_guide/content/ch_integrating_atlas.html
has the instructions on setting up 0.6 Atlas with Ambari


Regards,
Shwetha






On 20/01/16 2:26 am, "Herman Yu" <he...@teeupdata.com> wrote:

>Hi everyone,
>
>I am having some problems with installing and configuring v0.6 on HDP
>2.3.2 sandbox.
>
>HDP 2.3.2 comes with v0.5 pre-configured with Ambari. After compiling
>v0.6 source code, and updated the link /usr/hdp/current/atlas-server to
>be associated with the new v0.6 folder. I also made necessary changes in
>Ambari's (e.g. those notification related parameters in
>application.properties). Atlas starts fine (even though with some errors
>in log file which are kafka related), however, Hive stopped working with
>the following errors with any operations.
>
>I suspect this is the Hive hook related, after several tries, I figured
>out that if I update atlas.hook.hive.synchronous from true to false, Hive
>starts working fine however the hook doesn't capture any hive queries to
>atlas.
>
>Did anyone experience the same problem? Also, how does hive know where
>the jar file of the hook is located? in HDP 2.3.2, I don't see
>HIVE_AUX_PATH is appended with <Atlas_home>/bridge/hive, I had to
>manually append this in Ambari.
>
>
>org.apache.hive.service.cli.HiveSQLException: Error while processing
>statement: FAILED: Hive Internal Error:
>com.sun.jersey.api.client.ClientHandlerException(java.io.IOException:
>java.net.ConnectException: Connection refused)
>        at 
>org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.j
>ava:315)
>        at 
>org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.j
>ava:156)
>        at 
>org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperatio
>n.java:183)
>        at 
>org.apache.hive.service.cli.operation.Operation.run(Operation.java:257)
>        at 
>org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementIntern
>al(HiveSessionImpl.java:388)
>        at 
>org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveS
>essionImpl.java:369)
>        at 
>org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:26
>1)
>        at 
>org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(Thrif
>tCLIService.java:486)
>        at 
>org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.
>getResult(TCLIService.java:1313)
>        at 
>org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.
>getResult(TCLIService.java:1298)
>        at 
>org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>        at 
>org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>        at 
>org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressP
>rocessor.java:56)
>        at 
>org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolSe
>rver.java:285)
>        at 
>java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:
>1145)
>        at 
>java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java
>:615)
>        at java.lang.Thread.run(Thread.java:745)
>Caused by: com.sun.jersey.api.client.ClientHandlerException:
>java.io.IOException: java.net.ConnectException: Connection refused
>        at 
>com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLC
>onnectionClientHandler.java:149)
>        at com.sun.jersey.api.client.Client.handle(Client.java:648)
>        at 
>com.sun.jersey.api.client.WebResource.handle(WebResource.java:670)
>        at 
>com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
>        at 
>com.sun.jersey.api.client.WebResource$Builder.method(WebResource.java:623)
>        at 
>org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:584)
>        at 
>org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:579)
>        at org.apache.atlas.AtlasClient.getType(AtlasClient.java:257)
>        at 
>org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerHiveDataModel(Hiv
>eMetaStoreBridge.java:487)
>        at 
>org.apache.atlas.hive.hook.HiveHook.fireAndForget(HiveHook.java:197)
>        at org.apache.atlas.hive.hook.HiveHook.run(HiveHook.java:174)
>        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1522)
>        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
>        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
>        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1054)
>        at 
>org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.j
>ava:154)
>        ... 15 more
>
>
>Thanks
>Herman.
>