Posted to issues@kylin.apache.org by "XiaoXiang Yu (JIRA)" <ji...@apache.org> on 2019/07/02 07:52:00 UTC

[jira] [Commented] (KYLIN-4059) metrics write to hive error

    [ https://issues.apache.org/jira/browse/KYLIN-4059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16876756#comment-16876756 ] 

XiaoXiang Yu commented on KYLIN-4059:
-------------------------------------

I will check this under CDH 5.7 / Kylin 2.6.2.

> metrics write to hive error
> ---------------------------
>
>                 Key: KYLIN-4059
>                 URL: https://issues.apache.org/jira/browse/KYLIN-4059
>             Project: Kylin
>          Issue Type: Bug
>    Affects Versions: v2.6.2
>            Reporter: chenchen
>            Priority: Major
>         Attachments: 501561537981_.pic_hd.jpg, 511561539223_.pic.jpg
>
>
> My Hadoop version is 2.7.1.
>  
> I configured the cube planner and other metrics. When the query QPS on the machine is relatively high, the following error occurs (the console shows that the metrics data failed to be written to Hive).
> Looking at the source code, Kylin triggers an event for every query to notify the HiveProducer to write the metric data. Intuitively, Hive is not well suited to writes at such a high frequency.
> What is the recommended way to handle this?
>  
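One common way to tame the high-frequency write pattern described above is to buffer the per-query metric records in memory and flush them to Hive in batches, so the write rate to Hive is bounded by the batch settings rather than by the query QPS. The sketch below is illustrative only and is not Kylin's actual HiveProducer code; BufferedMetricsProducer, report and flushToHive are hypothetical names, and the batch size and flush interval are made-up values.

{code:java}
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch: buffer per-query metric records and flush them in
// batches, so Hive sees a few bulk writes instead of one write per query.
public class BufferedMetricsProducer {

    private final BlockingQueue<String> buffer = new LinkedBlockingQueue<>(100_000);
    private final int batchSize = 1_000;   // flush once this many records accumulate
    private final long maxWaitMs = 5_000;  // ...or after this long, whichever comes first

    public BufferedMetricsProducer() {
        Thread flusher = new Thread(this::flushLoop, "metrics-flusher");
        flusher.setDaemon(true);
        flusher.start();
    }

    // Called once per query; never blocks the query path.
    public void report(String record) {
        if (!buffer.offer(record)) {
            // Queue full: drop the record rather than stall queries.
            System.err.println("metrics buffer full, dropping record");
        }
    }

    private void flushLoop() {
        List<String> batch = new ArrayList<>(batchSize);
        while (true) {
            try {
                long deadline = System.currentTimeMillis() + maxWaitMs;
                while (batch.size() < batchSize) {
                    long remaining = deadline - System.currentTimeMillis();
                    if (remaining <= 0) break;
                    String record = buffer.poll(remaining, TimeUnit.MILLISECONDS);
                    if (record == null) break; // timed out waiting for more records
                    batch.add(record);
                }
                if (!batch.isEmpty()) {
                    flushToHive(batch); // one bulk write instead of many small ones
                    batch.clear();
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
    }

    // Placeholder for the actual Hive write (a real producer might append the
    // batch to a file under the metrics table's HDFS partition, for example).
    private void flushToHive(List<String> batch) {
        System.out.println("flushing " + batch.size() + " metric records to Hive");
    }

    public static void main(String[] args) throws InterruptedException {
        BufferedMetricsProducer producer = new BufferedMetricsProducer();
        for (int i = 0; i < 5_000; i++) {
            producer.report("query-metric-" + i); // simulate high query QPS
        }
        Thread.sleep(6_000); // give the daemon flusher time to drain the buffer
    }
}
{code}

With batching like this, Hive receives at most one bulk write per batch interval regardless of how many queries arrive in between, at the cost of metrics being delayed by up to the flush interval and possibly dropped if the buffer overflows.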



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)