Posted to dev@kylin.apache.org by Luke Han <lu...@gmail.com> on 2015/03/03 11:26:23 UTC

Fwd: The following error occurred while building a cube

Forwarding to the mailing list for further support.

On Tuesday, March 3, 2015 at 6:23:56 PM UTC+8, futur...@gmail.com wrote:
>
> The following error occurred while building a cube:
>
> [QuartzScheduler_Worker-1]:[2015-03-03 06:15:24,429][DEBUG][com.kylinolap.job.cmd.JavaHadoopCmdOutput.appendOutput(JavaHadoopCmdOutput.java:96)] - Start to execute command:
>  -cubename test_01 -segmentname FULL_BUILD -input /tmp/kylin-d3c34bba-2130-4644-b0ab-e6b440b55d79/test_01/fact_distinct_columns
> [QuartzScheduler_Worker-1]:[2015-03-03 06:15:39,361][ERROR][com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:55)] - Expect 1 and only 1 non-zero file under hdfs://sandbox.hortonworks.com:8020/user/kylin/demoTD/P09_ORG, but find 2
> java.lang.IllegalStateException: Expect 1 and only 1 non-zero file under hdfs://sandbox.hortonworks.com:8020/user/kylin/demoTD/P09_ORG, but find 2
>         at com.kylinolap.dict.lookup.HiveTable.findOnlyFile(HiveTable.java:120)
>         at com.kylinolap.dict.lookup.HiveTable.computeHDFSLocation(HiveTable.java:105)
>         at com.kylinolap.dict.lookup.HiveTable.getHDFSLocation(HiveTable.java:79)
>         at com.kylinolap.dict.lookup.HiveTable.getFileTable(HiveTable.java:72)
>         at com.kylinolap.dict.lookup.HiveTable.getSignature(HiveTable.java:67)
>         at com.kylinolap.dict.DictionaryManager.buildDictionary(DictionaryManager.java:158)
>         at com.kylinolap.cube.CubeManager.buildDictionary(CubeManager.java:171)
>         at com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:50)
>         at com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:39)
>         at com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:51)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>         at com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:53)
>         at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
>         at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>         at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
> [QuartzScheduler_Worker-1]:[2015-03-03 06:15:39,364][DEBUG][com.kylinolap.job.cmd.JavaHadoopCmdOutput.appendOutput(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
>
>
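
For reference, the failing check lives in HiveTable.findOnlyFile: it lists the
lookup table's directory on HDFS and insists on exactly one non-empty file.
Below is a small sketch (not Kylin code) that runs the same kind of count with
the plain Hadoop FileSystem API; the path is the one from the log above, so
swap in your own lookup table location.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CheckLookupTableFiles {
        public static void main(String[] args) throws Exception {
            // Path taken from the log above; replace it with your own lookup table location.
            Path dir = new Path("hdfs://sandbox.hortonworks.com:8020/user/kylin/demoTD/P09_ORG");
            FileSystem fs = FileSystem.get(dir.toUri(), new Configuration());

            int nonZeroFiles = 0;
            for (FileStatus status : fs.listStatus(dir)) {
                // The dictionary step only cares about non-empty regular files.
                if (status.isFile() && status.getLen() > 0) {
                    System.out.println(status.getPath() + " (" + status.getLen() + " bytes)");
                    nonZeroFiles++;
                }
            }
            System.out.println("Non-zero files under " + dir + ": " + nonZeroFiles);
        }
    }

If this prints more than one file, you are hitting the same situation as the
log above.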

Re: The following error occurred while building a cube

Posted by Li Yang <li...@apache.org>.
Btw, the latest version no longer requires 2), meaning the lookup table can
consist of more than one file. However, it still must be small enough to fit
in memory.

On Tue, Mar 3, 2015 at 10:36 PM, hongbin ma <ma...@apache.org> wrote:

> On Tue, Mar 3, 2015 at 6:26 PM, Luke Han <lu...@gmail.com> wrote:
>
> > CreateDictionaryJob
> >
>
> hi, for lookup tables we assume: 1. they're small in size (compared to
> the fact table); 2. they consist of only one file in Hive
>
> please stick to these assumptions.
>
> thanks
> hongbin
>
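
To get a rough idea of whether a lookup table will fit in memory, you can
total the size of its files on HDFS. A minimal sketch along those lines (the
path is again the one from the log; what counts as "small enough" depends on
the heap available to the Kylin server, there is no fixed number implied here):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class LookupTableSize {
        public static void main(String[] args) throws Exception {
            // Path taken from the log above; replace it with your own lookup table location.
            Path dir = new Path("hdfs://sandbox.hortonworks.com:8020/user/kylin/demoTD/P09_ORG");
            FileSystem fs = FileSystem.get(dir.toUri(), new Configuration());

            // getContentSummary() totals the lengths of all files under the directory.
            long bytes = fs.getContentSummary(dir).getLength();
            System.out.println("Lookup table size: " + bytes + " bytes (~" + (bytes >> 20) + " MB)");
        }
    }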

Re: The following error occurred while building a cube

Posted by hongbin ma <ma...@apache.org>.
On Tue, Mar 3, 2015 at 6:26 PM, Luke Han <lu...@gmail.com> wrote:

> CreateDictionaryJob
>

hi, for lookup tables we assume: 1. they're small in size (compared to
the fact table); 2. they consist of only one file in Hive

please stick to these assumptions.

thanks
hongbin
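
If a lookup table has ended up as several files (for example after multiple
INSERTs or a multi-mapper load), one common way to consolidate it is to
rewrite it in Hive with small-file merging turned on, then rebuild the cube.
Below is a rough sketch over Hive JDBC; the HiveServer2 URL, credentials and
table name are placeholders taken from the log path, hive.merge.mapfiles and
hive.merge.mapredfiles are standard Hive settings whose effect can vary by
version, so verify the result by listing the table directory afterwards.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CompactLookupTable {
        public static void main(String[] args) throws Exception {
            // HiveServer2 URL, credentials and table name are placeholders; adjust for your cluster.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            String url = "jdbc:hive2://sandbox.hortonworks.com:10000/default";
            try (Connection conn = DriverManager.getConnection(url, "hive", "");
                 Statement stmt = conn.createStatement()) {
                // Ask Hive to merge small output files when the table is rewritten.
                stmt.execute("SET hive.merge.mapfiles=true");
                stmt.execute("SET hive.merge.mapredfiles=true");
                // Rewrite the table onto itself; with merging enabled this usually leaves one file.
                stmt.execute("INSERT OVERWRITE TABLE P09_ORG SELECT * FROM P09_ORG");
            }
        }
    }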