Posted to dev@kylin.apache.org by 耳东 <77...@qq.com> on 2016/05/10 03:13:52 UTC

killed by admin

Hi all:

         When I build the cube, the second step 'Extract Fact Table Distinct Columns' fails and the log shows 'killed by admin'.
I could not find any error log.

Re: killed by admin

Posted by 耳东 <77...@qq.com>.
Yes, it is because the REDUCE capability required is more than the supported max container capability in the cluster, so YARN killed the job: 'reduceResourceRequest: <memory:4096, vCores:1> maxContainerCapability:<memory:2048, vCores:8>'.
The problem is solved after I changed the max container memory to 4096 or a bigger number.
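For reference, the "maxContainerCapability" in that log line corresponds to YARN's scheduler maximum allocation. A minimal yarn-site.xml sketch of the fix described above (the property names are standard YARN settings; the values here are only illustrative and must match the cluster's actual resources):

```xml
<!-- yarn-site.xml: raise the largest container the scheduler may grant -->
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>4096</value> <!-- must be >= the reduce request (memory:4096) -->
</property>

<!-- each NodeManager must also have enough memory to host such a container -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>8192</value> <!-- illustrative; set to the node's usable RAM -->
</property>
```

Alternatively, instead of raising the cluster limit, the reduce-side request could be lowered (e.g. mapreduce.reduce.memory.mb in Kylin's Hadoop job configuration) so that it fits under the existing maxContainerCapability.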




------------------ Original message ------------------
From: "Li Yang" <li...@apache.org>
Sent: Sunday, May 15, 2016, 7:35 PM
To: "dev" <de...@kylin.apache.org>

Subject: Re: killed by admin



At least the Hadoop job history server should have some traces.

On Tue, May 10, 2016 at 11:13 AM, 耳东 <77...@qq.com> wrote:

> Hi all:
>
>          When I build the cube, in the second step 'Extract Fact Table
> Distinct Columns', the log shows 'killed by admin'.
> And I could not find any error log.

Re: killed by admin

Posted by Li Yang <li...@apache.org>.
At least the Hadoop job history server should have some traces.

On Tue, May 10, 2016 at 11:13 AM, 耳东 <77...@qq.com> wrote:

> Hi all:
>
>          When I build the cube, in the second step 'Extract Fact Table
> Distinct Columns', the log shows 'killed by admin'.
> And I could not find any error log.