Posted to issues@kylin.apache.org by "chenchen (JIRA)" <ji...@apache.org> on 2019/06/26 08:44:00 UTC

[jira] [Created] (KYLIN-4059) metrics write to hive error

chenchen created KYLIN-4059:
-------------------------------

             Summary: metrics write to hive error
                 Key: KYLIN-4059
                 URL: https://issues.apache.org/jira/browse/KYLIN-4059
             Project: Kylin
          Issue Type: Bug
            Reporter: chenchen
         Attachments: 501561537981_.pic_hd.jpg

My Hadoop version is 2.7.1.

 

I configured the cube planner and other metrics. When the query QPS on the machine is relatively high, the following error occurs (the console shows that writing metrics data to Hive failed).


By looking at the source code, I found that Kylin triggers an event for every query, which notifies the HiveProducer to write the metrics data. In general, Hive is not well suited to this kind of high-frequency, per-query writing.


What is the recommended solution for this?
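One possible mitigation I am considering is to buffer the per-query metric events in memory and flush them to Hive in batches, so Hive only sees a few large writes instead of one small write per query. Below is a minimal sketch of that idea; the class and method names (BatchingMetricsReporter, report, flush) are hypothetical illustrations and not Kylin's actual HiveProducer API.

{code:java}
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch: buffer per-query metric records and flush them
// to Hive on a fixed interval instead of writing once per query.
public class BatchingMetricsReporter {

    private final BlockingQueue<String> buffer = new LinkedBlockingQueue<>(100_000);
    private final ScheduledExecutorService flusher = Executors.newSingleThreadScheduledExecutor();

    public BatchingMetricsReporter(long flushIntervalSeconds) {
        // Periodic flush: Hive sees a few large writes instead of many tiny ones.
        flusher.scheduleAtFixedRate(this::flush, flushIntervalSeconds,
                flushIntervalSeconds, TimeUnit.SECONDS);
    }

    // Called once per query; cheap and never blocks the query thread on Hive.
    public void report(String metricRecord) {
        // If the buffer is full, drop the record rather than stalling queries.
        buffer.offer(metricRecord);
    }

    private void flush() {
        List<String> batch = new ArrayList<>();
        buffer.drainTo(batch);
        if (batch.isEmpty()) {
            return;
        }
        // In a real reporter this would append the whole batch to the Hive
        // metrics table in one write; here we only print the batch size.
        System.out.println("Flushing " + batch.size() + " metric records to Hive");
    }

    public void shutdown() {
        flusher.shutdown();
        flush(); // best-effort final flush
    }

    public static void main(String[] args) throws InterruptedException {
        BatchingMetricsReporter reporter = new BatchingMetricsReporter(5);
        for (int i = 0; i < 1000; i++) {
            reporter.report("query-" + i + ",latency=12ms");
        }
        Thread.sleep(6000);
        reporter.shutdown();
    }
}
{code}

This is only a sketch of the batching idea under high QPS; whether Kylin's metrics reporter can be configured or patched to behave this way is exactly what I am asking about.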

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)