Posted to dev@kylin.apache.org by Santoshakhilesh <sa...@huawei.com> on 2015/03/04 08:28:10 UTC

Cube Build failed at Step 3, When I choose Hierarchical dimension

Dear All ,

         I am using the 0.6.5 branch of Kylin. I was able to build a cube defining normal and derived measures and to play with it.

         I have defined a new cube to test hierarchical dimensions, and the cube build fails at Step 3 with the following log in kylin.log.

         I have run the query that Kylin shows on the cube's web UI directly in Hive, and it works.

         Please let me know what is going wrong. If any more information is required from me, please let me know.



java.lang.NullPointerException
 at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)



[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,025][INFO][com.kylinolap.dict.lookup.SnapshotManager.load(SnapshotManager.java:156)] - Loading snapshotTable from /table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot, with loadData: false
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,031][INFO][com.kylinolap.dict.lookup.SnapshotManager.buildSnapshot(SnapshotManager.java:90)] - Identical input FileSignature [path=file:/hive/warehouse/store_dim/stores.txt, size=60, lastModifiedTime=1425039202000], reuse existing snapshot at /table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,031][DEBUG][com.kylinolap.common.persistence.ResourceStore.putResource(ResourceStore.java:166)] - Saving resource /cube/NDim.json (Store kylin_metadata_qa@hbase)
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,035][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourceTable(MetadataManager.java:258)] - Reloading SourceTable from folder kylin_metadata_qa(key='/table')@kylin_metadata_qa@hbase
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourceTable(MetadataManager.java:267)] - Loaded 4 SourceTable(s)
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourceTableExd(MetadataManager.java:243)] - Reloading SourceTable exd info from folder kylin_metadata_qa(key='/table_exd')@kylin_metadata_qa@hbase
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,104][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourceTableExd(MetadataManager.java:253)] - Loaded 4 SourceTable EXD(s)
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,104][INFO][com.kylinolap.metadata.MetadataManager.reloadAllCubeDesc(MetadataManager.java:308)] - Reloading Cube Metadata from folder kylin_metadata_qa(key='/cube_desc')@kylin_metadata_qa@hbase
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,143][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllCubeDesc(MetadataManager.java:333)] - Loaded 4 Cube(s)
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,143][INFO][com.kylinolap.metadata.MetadataManager.reloadAllInvertedIndexDesc(MetadataManager.java:356)] - Reloading Inverted Index Desc from folder kylin_metadata_qa(key='/invertedindex_desc')@kylin_metadata_qa@hbase
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,147][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllInvertedIndexDesc(MetadataManager.java:381)] - Loaded 0 Inverted Index Desc(s)
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,158][INFO][com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:59)] - Checking snapshot of STORE_DIM
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,159][ERROR][com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:55)] -
java.lang.NullPointerException
 at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
 at com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:60)
 at com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:39)
 at com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:51)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
 at com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:54)
 at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
 at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
 at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,159][DEBUG][com.kylinolap.job.cmd.JavaHadoopCmdOutput.appendOutput(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,166][DEBUG][com.kylinolap.common.persistence.ResourceStore.putResource(ResourceStore.java:166)] - Saving resource /job_output/70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 (Store kylin_metadata_qa@hbase)
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,174][DEBUG][com.kylinolap.common.persistence.ResourceStore.putResource(ResourceStore.java:166)] - Saving resource /job/70f7dfe5-f414-4643-a014-3ba5c5d3ab22 (Store kylin_metadata_qa@hbase)
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:87)] - Job status for cube_job_group.NDim.70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 has been updated.
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:88)] - cmd: -cubename NDim -segmentname FULL_BUILD -input /tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_columns
[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:89)] - output:Start to execute command:
 -cubename NDim -segmentname FULL_BUILD -input /tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_columns
Command execute return code 2
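
For context, the step that fails here is the dictionary-building step (CreateDictionaryJob in the log above), which resolves each lookup-table dimension through CubeManager.getLookupTable(); as Shaofeng points out further down this thread, the NullPointerException at that line is what you get when a dimension on a lookup table has no "join" defined in the cube descriptor. A minimal sketch of a dimension block that does carry the join, using the STORE_DIM names from the cube JSON shown later in this thread:

{
  "id": 1,
  "name": "AREA",
  "join": {
    "type": "left",
    "primary_key": ["STOREID"],
    "foreign_key": ["STOREID"]
  },
  "hierarchy": [
    { "level": "1", "column": "STATE" },
    { "level": "2", "column": "CITY" }
  ],
  "table": "STORE_DIM",
  "column": "{FK}",
  "datatype": null,
  "derived": null
}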



Regards,
Santosh Akhilesh
Bangalore R&D
HUAWEI TECHNOLOGIES CO.,LTD.

www.huawei.com

Re: Cube Build failed at Step 3, When I choose Hierarchical dimension

Posted by "Shi, Shaofeng" <sh...@ebay.com>.
Yes it is supported;
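
For example, two measures over the same fact column differ only in their "function" blocks. A sketch in the same descriptor format used elsewhere in this thread (the ids, the AMOUNT_* names, and the MAX expression are illustrative rather than taken from Santosh's cube, and assume MAX is available in this build):

"measures": [
  {
    "id": 4,
    "name": "AMOUNT_SUM",
    "function": {
      "expression": "SUM",
      "parameter": { "type": "column", "value": "AMOUNT" },
      "returntype": "double"
    },
    "dependent_measure_ref": null
  },
  {
    "id": 5,
    "name": "AMOUNT_MAX",
    "function": {
      "expression": "MAX",
      "parameter": { "type": "column", "value": "AMOUNT" },
      "returntype": "double"
    },
    "dependent_measure_ref": null
  }
]

Both entries would sit in the cube's "measures" array alongside the existing TOTALAMOUNT definition.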

On 3/4/15, 11:56 PM, "Santosh Akhilesh" <sa...@gmail.com> wrote:

>I have another question: can I add multiple measures on the same column of
>the fact table?
>Like sum(value), count(value), ...
>On Wed, 4 Mar 2015 at 9:19 pm, Santosh Akhilesh
><sa...@gmail.com>
>wrote:
>
>> Oh, it looks like it's about the last_modified time in the JSON; should I set it to 0?
>> I am out of office and will give it a try tomorrow. If it is something else, please let me
>> know.
>> On Wed, 4 Mar 2015 at 7:40 pm, Santoshakhilesh <
>> santosh.akhilesh@huawei.com> wrote:
>>
>>> Hi Shaofeng,
>>>     I have changed the property file to create a new HTable, so
>>> everything was fresh.
>>>     Unless there is some restriction on joins for hierarchical dimensions,
>>> I think there is a bug in the wizard while creating hierarchical dimensions:
>>> it simply deletes the join condition on the table on which the hierarchy
>>> dimension is added.
>>>
>>>     I have tried to use the JSON editor; when I open it, it shows a sample
>>> JSON and not the current JSON of the cube. I tried deleting that and pasting the
>>> JSON which I get from the cube JSON view, but while saving I get the
>>> following error.
>>>
>>>
>>> Error Message
>>> Overwriting conflict /cube_desc/HierarchyCube.json, expect old TS
>>> 1425496007007, but it is 0
>>>
>>> I had tried deleting the existing cube and then creating it using the JSON
>>> editor, but got the same error. I also tried removing the UUID part of the JSON, as I
>>> thought you might be generating this internally, but still the same error.
>>>
>>> So what options am I left with: install 0.7 and give it a try?
>>>
>>> I had modified the JSON as below to add the join condition. I also have
>>> another query: why is the datatype in the JSON always null?
>>>
>>> {
>>>   "uuid": "3721b133-f11a-4ffa-98af-8d8688db3706",
>>>   "name": "HierarchyCube",
>>>   "description": "",
>>>   "dimensions": [
>>>     {
>>>       "id": 1,
>>>       "name": "AREA",
>>>       "join": {
>>>         "type": "left",
>>>         "primary_key": [
>>>           "STOREID"
>>>         ],
>>>         "foreign_key": [
>>>           "STOREID"
>>>         ]
>>>       },
>>>       "hierarchy": [
>>>         {
>>>           "level": "1",
>>>           "column": "STATE"
>>>         },
>>>         {
>>>           "level": "2",
>>>           "column": "CITY"
>>>         }
>>>       ],
>>>       "table": "STORE_DIM",
>>>       "column": "{FK}",
>>>       "datatype": null,
>>>       "derived": null
>>>     },
>>>     {
>>>       "id": 2,
>>>       "name": "SALES_FACT.STOREID",
>>>       "join": null,
>>>       "hierarchy": null,
>>>       "table": "SALES_FACT",
>>>       "column": "STOREID",
>>>       "datatype": null,
>>>       "derived": null
>>>     },
>>>     {
>>>       "id": 3,
>>>       "name": "SALES_FACT.ITEMID",
>>>       "join": null,
>>>       "hierarchy": null,
>>>       "table": "SALES_FACT",
>>>       "column": "ITEMID",
>>>       "datatype": null,
>>>       "derived": null
>>>     },
>>>     {
>>>       "id": 4,
>>>       "name": "SALES_FACT.CUSTOMERID",
>>>       "join": null,
>>>       "hierarchy": null,
>>>       "table": "SALES_FACT",
>>>       "column": "CUSTOMERID",
>>>       "datatype": null,
>>>       "derived": null
>>>     },
>>>     {
>>>       "id": 5,
>>>       "name": "CUSTOMER_DIM_DERIVED",
>>>       "join": {
>>>         "type": "left",
>>>         "primary_key": [
>>>           "CUSTOMERID"
>>>         ],
>>>         "foreign_key": [
>>>           "CUSTOMERID"
>>>         ]
>>>       },
>>>       "hierarchy": null,
>>>       "table": "CUSTOMER_DIM",
>>>       "column": "{FK}",
>>>       "datatype": null,
>>>       "derived": [
>>>         "NAME"
>>>       ]
>>>     },
>>>     {
>>>       "id": 6,
>>>       "name": "ITEM_DIM_DERIVED",
>>>       "join": {
>>>         "type": "left",
>>>         "primary_key": [
>>>           "ITEMID"
>>>         ],
>>>         "foreign_key": [
>>>           "ITEMID"
>>>         ]
>>>       },
>>>       "hierarchy": null,
>>>       "table": "ITEM_DIM",
>>>       "column": "{FK}",
>>>       "datatype": null,
>>>       "derived": [
>>>         "BRAND",
>>>         "COLOR"
>>>       ]
>>>     }
>>>   ],
>>>   "measures": [
>>>     {
>>>       "id": 1,
>>>       "name": "_COUNT_",
>>>       "function": {
>>>         "expression": "COUNT",
>>>         "parameter": {
>>>           "type": "constant",
>>>           "value": "1"
>>>         },
>>>         "returntype": "bigint"
>>>       },
>>>       "dependent_measure_ref": null
>>>     },
>>>     {
>>>       "id": 2,
>>>       "name": "TOTALAMOUNT",
>>>       "function": {
>>>         "expression": "SUM",
>>>         "parameter": {
>>>           "type": "column",
>>>           "value": "AMOUNT"
>>>         },
>>>         "returntype": "double"
>>>       },
>>>       "dependent_measure_ref": null
>>>     },
>>>     {
>>>       "id": 3,
>>>       "name": "TOTALQTY",
>>>       "function": {
>>>         "expression": "SUM",
>>>         "parameter": {
>>>           "type": "column",
>>>           "value": "QTY"
>>>         },
>>>         "returntype": "int"
>>>       },
>>>       "dependent_measure_ref": null
>>>     }
>>>   ],
>>>   "rowkey": {
>>>     "rowkey_columns": [
>>>       {
>>>         "column": "STATE",
>>>         "length": 0,
>>>         "dictionary": "true",
>>>         "mandatory": false
>>>       },
>>>       {
>>>         "column": "CITY",
>>>         "length": 0,
>>>         "dictionary": "true",
>>>         "mandatory": false
>>>       },
>>>       {
>>>         "column": "STOREID",
>>>         "length": 0,
>>>         "dictionary": "true",
>>>         "mandatory": false
>>>       },
>>>       {
>>>         "column": "ITEMID",
>>>         "length": 0,
>>>         "dictionary": "true",
>>>         "mandatory": false
>>>       },
>>>       {
>>>         "column": "CUSTOMERID",
>>>         "length": 0,
>>>         "dictionary": "true",
>>>         "mandatory": false
>>>       }
>>>     ],
>>>     "aggregation_groups": [
>>>       [
>>>         "STOREID",
>>>         "ITEMID",
>>>         "CUSTOMERID"
>>>       ],
>>>       [
>>>         "STATE",
>>>         "CITY"
>>>       ]
>>>     ]
>>>   },
>>>   "signature": "OiAo60gPr38KVois4jHGKw==",
>>>   "capacity": "MEDIUM",
>>>   "last_modified": 1425496007007,
>>>   "fact_table": "SALES_FACT",
>>>   "null_string": null,
>>>   "filter_condition": null,
>>>   "cube_partition_desc": {
>>>     "partition_date_column": null,
>>>     "partition_date_start": 0,
>>>     "cube_partition_type": "APPEND"
>>>   },
>>>   "hbase_mapping": {
>>>     "column_family": [
>>>       {
>>>         "name": "F1",
>>>         "columns": [
>>>           {
>>>             "qualifier": "M",
>>>             "measure_refs": [
>>>               "_COUNT_",
>>>               "TOTALAMOUNT",
>>>               "TOTALQTY"
>>>             ]
>>>           }
>>>         ]
>>>       }
>>>     ]
>>>   },
>>>   "notify_list": []
>>> }
>>>
>>>
>>> Regards,
>>> Santosh Akhilesh
>>> Bangalore R&D
>>> HUAWEI TECHNOLOGIES CO.,LTD.
>>>
>>> www.huawei.com
>>>
>>> ________________________________________
>>> From: Shi, Shaofeng [shaoshi@ebay.com]
>>> Sent: Wednesday, March 04, 2015 6:40 PM
>>> To: dev@kylin.incubator.apache.org
>>> Cc: Kulbhushan Rana
>>> Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
>>> dimension
>>>
>>> Hi Santosh, 0.6.5 should be stable, but there is an obvious error in this
>>> cube JSON: dimension 1 is from the lookup table “STORE_DIM”, and in this case
>>> the “join” must be specified, while now it is null; this causes Kylin to
>>> fail to join the fact table with the lookup table. Please try to edit the cube
>>> to add the join condition, so that it looks like the other dimensions; if the
>>> wizard doesn’t work, try to manually add the join and then use the “JSON
>>> Editor” function to update the cube;
>>>
>>> From now on we would suggest users to use 0.7.1; the binary package will be
>>> much easier to install, and there are many enhancements and bug fixes in
>>> 0.7.1;
>>>
>>> To get a fresh metadata store, just use a different HTable; this is
>>> configurable in kylin.properties:
>>>
>>> kylin.metadata.url=kylin_metadata_qa@hbase
>>>
>>> The default name is kylin_metadata_qa; if you change the name, Kylin will get a
>>> fresh metadata store;
>>>
>>>
>>> On 3/4/15, 8:52 PM, "Santoshakhilesh" <sa...@huawei.com>
>>> wrote:
>>>
>>> >Hi Shaofeng,
>>> >
>>> >      I had deleted the cube and tried to build again for the hierarchy; now
>>> >it fails in the first step itself.
>>> >      I have three dimension tables:
>>> >      a) customer_dim b) store_dim c) item_dim
>>> >      I chose a left join with the fact table while creating the dimensions, but
>>> >after cube creation the left join on store_dim is automatically deleted by
>>> >Kylin.
>>> >      The store dimension has the fields storeid, city, state; I had tried to
>>> >add the hierarchy dimension as (1) State and (2) City.
>>> >
>>> >      I have started facing many issues; even with normal dimensions the
>>> >Kylin query result does not match the Hive query.
>>> >      I had also tried to delete all the metadata in Hive and HBase (I
>>> >had deleted all the entries) and started from the beginning by creating a new
>>> >project, but the problem persists.
>>> >
>>> >      Do you suggest I install the binary distribution; is it stable
>>> >enough now? If yes, how do I make sure all the previous data is deleted?
>>> >Is deleting the Hive and HBase data enough, or should I delete something else
>>> >too?
>>> >
>>> >      Logs are as below. Sorry for the long mail.
>>> >
>>> >
>>> >
>>> >
>>> >
>>> >JSON:
>>> >{
>>> >  "uuid": "5a5adb86-202a-4e2b-9be7-de421d3bdf2f",
>>> >  "name": "Hierarchy",
>>> >  "description": "",
>>> >  "dimensions": [
>>> >    {
>>> >      "id": 1,
>>> >      "name": "AREA",
>>> >      "join": null,
>>> >      "hierarchy": [
>>> >        {
>>> >          "level": "1",
>>> >          "column": "STATE"
>>> >        },
>>> >        {
>>> >          "level": "2",
>>> >          "column": "CITY"
>>> >        }
>>> >      ],
>>> >      "table": "STORE_DIM",
>>> >      "column": null,
>>> >      "datatype": null,
>>> >      "derived": null
>>> >    },
>>> >    {
>>> >      "id": 2,
>>> >      "name": "CUSTOMER_DIM_DERIVED",
>>> >      "join": {
>>> >        "type": "left",
>>> >        "primary_key": [
>>> >          "CUSTOMERID"
>>> >        ],
>>> >        "foreign_key": [
>>> >          "CUSTOMERID"
>>> >        ]
>>> >      },
>>> >      "hierarchy": null,
>>> >      "table": "CUSTOMER_DIM",
>>> >      "column": "{FK}",
>>> >      "datatype": null,
>>> >      "derived": [
>>> >        "NAME"
>>> >      ]
>>> >    },
>>> >    {
>>> >      "id": 3,
>>> >      "name": "ITEM_DIM_DERIVED",
>>> >      "join": {
>>> >        "type": "left",
>>> >        "primary_key": [
>>> >          "ITEMID"
>>> >        ],
>>> >        "foreign_key": [
>>> >          "ITEMID"
>>> >        ]
>>> >      },
>>> >      "hierarchy": null,
>>> >      "table": "ITEM_DIM",
>>> >      "column": "{FK}",
>>> >      "datatype": null,
>>> >      "derived": [
>>> >        "TYPE",
>>> >        "BRAND",
>>> >        "COLOR"
>>> >      ]
>>> >    }
>>> >  ],
>>> >  "measures": [
>>> >    {
>>> >      "id": 1,
>>> >      "name": "_COUNT_",
>>> >      "function": {
>>> >        "expression": "COUNT",
>>> >        "parameter": {
>>> >          "type": "constant",
>>> >          "value": "1"
>>> >        },
>>> >        "returntype": "bigint"
>>> >      },
>>> >      "dependent_measure_ref": null
>>> >    },
>>> >    {
>>> >      "id": 2,
>>> >      "name": "TOTALAMOUNT",
>>> >      "function": {
>>> >        "expression": "SUM",
>>> >        "parameter": {
>>> >          "type": "column",
>>> >          "value": "AMOUNT"
>>> >        },
>>> >        "returntype": "double"
>>> >      },
>>> >      "dependent_measure_ref": null
>>> >    },
>>> >    {
>>> >      "id": 3,
>>> >      "name": "TOTALQTY",
>>> >      "function": {
>>> >        "expression": "SUM",
>>> >        "parameter": {
>>> >          "type": "column",
>>> >          "value": "QTY"
>>> >        },
>>> >        "returntype": "int"
>>> >      },
>>> >      "dependent_measure_ref": null
>>> >    }
>>> >  ],
>>> >  "rowkey": {
>>> >    "rowkey_columns": [
>>> >      {
>>> >        "column": "STATE",
>>> >        "length": 0,
>>> >        "dictionary": "true",
>>> >        "mandatory": false
>>> >      },
>>> >      {
>>> >        "column": "CITY",
>>> >        "length": 0,
>>> >        "dictionary": "true",
>>> >        "mandatory": false
>>> >      },
>>> >      {
>>> >        "column": "CUSTOMERID",
>>> >        "length": 0,
>>> >        "dictionary": "true",
>>> >        "mandatory": false
>>> >      },
>>> >      {
>>> >        "column": "ITEMID",
>>> >        "length": 0,
>>> >        "dictionary": "true",
>>> >        "mandatory": false
>>> >      }
>>> >    ],
>>> >    "aggregation_groups": [
>>> >      [
>>> >        "CUSTOMERID",
>>> >        "ITEMID"
>>> >      ],
>>> >      [
>>> >        "STATE",
>>> >        "CITY"
>>> >      ]
>>> >    ]
>>> >  },
>>> >  "signature": "X6NQ6wZ9ZgvhBLqw0YAKhQ==",
>>> >  "capacity": "MEDIUM",
>>> >  "last_modified": 1425492494239,
>>> >  "fact_table": "SALES_FACT",
>>> >  "null_string": null,
>>> >  "filter_condition": null,
>>> >  "cube_partition_desc": {
>>> >    "partition_date_column": null,
>>> >    "partition_date_start": 0,
>>> >    "cube_partition_type": "APPEND"
>>> >  },
>>> >  "hbase_mapping": {
>>> >    "column_family": [
>>> >      {
>>> >        "name": "F1",
>>> >        "columns": [
>>> >          {
>>> >            "qualifier": "M",
>>> >            "measure_refs": [
>>> >              "_COUNT_",
>>> >              "TOTALAMOUNT",
>>> >              "TOTALQTY"
>>> >            ]
>>> >          }
>>> >        ]
>>> >      }
>>> >    ]
>>> >  },
>>> >  "notify_list": []
>>> >}
>>> >
>>> >Logs:
>>> >
>>> >15/03/05 02:09:18 WARN conf.HiveConf: DEPRECATED: Configuration
>>>property
>>> >hive.metastore.local no longer has any effect. Make sure to provide a
>>> >valid value for hive.metastore.uris if you are connecting to a remote
>>> >metastore.
>>> >15/03/05 02:09:18 WARN conf.HiveConf: HiveConf of name
>>> >hive.metastore.local does not exist
>>> >Logging initialized using configuration in
>>> >jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-common-
>>> 0.14.0.jar!/hive
>>> >-log4j.properties
>>> >SLF4J: Class path contains multiple SLF4J bindings.
>>> >SLF4J: Found binding in
>>> >[jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/common/lib/
>>> slf4j-log4j12-1
>>> >.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> >SLF4J: Found binding in
>>> >[jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-jdbc-0.
>>> 14.0-standalone
>>> >.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> >SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> >explanation.
>>> >SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>> >OK
>>> >Time taken: 0.578 seconds
>>> >OK
>>> >Time taken: 0.444 seconds
>>> >FAILED: SemanticException [Error 10004]: Line 4:0 Invalid table alias
>>>or
>>> >column reference 'STORE_DIM': (possible column names are:
>>> >sales_fact.storeid, sales_fact.itemid, sales_fact.customerid,
>>> >sales_fact.qty, sales_fact.amount, customer_dim.customerid,
>>> >customer_dim.name, item_dim.itemid, item_dim.type, item_dim.brand,
>>> >item_dim.color)
>>> >
>>> >
>>> >
>>> >Regards,
>>> >Santosh Akhilesh
>>> >Bangalore R&D
>>> >HUAWEI TECHNOLOGIES CO.,LTD.
>>> >
>>> >www.huawei.com
>>> >
>>> >________________________________________
>>> >From: Shi, Shaofeng [shaoshi@ebay.com]
>>> >Sent: Wednesday, March 04, 2015 5:36 PM
>>> >To: dev@kylin.incubator.apache.org
>>> >Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
>>> >dimension
>>> >
>>> >It seems that you have a lookup table which doesn't define the join
>>> >relationship; Could you paste the full json of this cube definition?
>>> >
>>> >On 3/4/15, 3:28 PM, "Santoshakhilesh" <sa...@huawei.com>
>>> wrote:
>>> >
>>> >>Dear All ,
>>> >>
>>> >>         I am using 0.6.5 branch of Kylin. I was able to build a cube
>>> >>defining normal and derived measures and play with it.
>>> >>
>>> >>         I have defined a new cube to test hierarchial dimensions and
>>> >>cube build is failed at Step 3 with following log in kylin.log
>>> >>
>>> >>         I have run the query which kylin provides on webui of cube
>>>on
>>> >>hive and it works.
>>> >>
>>> >>         Please let me know whats going wrong ? Any more info
>>>required
>>> >>from me please let me know.
>>> >>
>>> >>
>>> >>
>>> >>java.lang.NullPointerException
>>> >> at 
>>>com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>>> >>
>>> >>
>>> >>
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,025][INFO][com.kylinolap.dict.lookup.SnapshotMana
>>> ger.load(Snapsh
>>> >>o
>>> >>tManager.java:156)] - Loading snapshotTable from
>>> >>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f1
>>> 9283a.snapshot,
>>> >>with loadData: false
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,031][INFO][com.kylinolap.dict.lookup.SnapshotMana
>>> ger.buildSnapsh
>>> >>o
>>> >>t(SnapshotManager.java:90)] - Identical input FileSignature
>>> >>[path=file:/hive/warehouse/store_dim/stores.txt, size=60,
>>> >>lastModifiedTime=1425039202000], reuse existing snapshot at
>>> >>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f1
>>> 9283a.snapshot
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,031][DEBUG][com.kylinolap.common.persistence.Reso
>>> urceStore.putRe
>>> >>s
>>> >>ource(ResourceStore.java:166)] - Saving resource /cube/NDim.json
>>>(Store
>>> >>kylin_metadata_qa@hbase<ma...@hbase>)
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,035][DEBUG][com.kylinolap.metadata.MetadataManage
>>> r.reloadAllSour
>>> >>c
>>> >>eTable(MetadataManager.java:258)] - Reloading SourceTable from folder
>>> >>kylin_metadata_qa(key='/table')@kylin_metadata_qa@hbase
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManage
>>> r.reloadAllSour
>>> >>c
>>> >>eTable(MetadataManager.java:267)] - Loaded 4 SourceTable(s)
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManage
>>> r.reloadAllSour
>>> >>c
>>> >>eTableExd(MetadataManager.java:243)] - Reloading SourceTable exd info
>>> >>from folder 
>>>kylin_metadata_qa(key='/table_exd')@kylin_metadata_qa@hbase
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,104][DEBUG][com.kylinolap.metadata.MetadataManage
>>> r.reloadAllSour
>>> >>c
>>> >>eTableExd(MetadataManager.java:253)] - Loaded 4 SourceTable EXD(s)
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,104][INFO][com.kylinolap.metadata.MetadataManager
>>> .reloadAllCubeD
>>> >>e
>>> >>sc(MetadataManager.java:308)] - Reloading Cube Metadata from folder
>>> >>kylin_metadata_qa(key='/cube_desc')@kylin_metadata_qa@hbase
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,143][DEBUG][com.kylinolap.metadata.MetadataManage
>>> r.reloadAllCube
>>> >>D
>>> >>esc(MetadataManager.java:333)] - Loaded 4 Cube(s)
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,143][INFO][com.kylinolap.metadata.MetadataManager
>>> .reloadAllInver
>>> >>t
>>> >>edIndexDesc(MetadataManager.java:356)] - Reloading Inverted Index
>>>Desc
>>> >>from folder
>>> >>kylin_metadata_qa(key='/invertedindex_desc')@kylin_metadata_qa@hbase
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,147][DEBUG][com.kylinolap.metadata.MetadataManage
>>> r.reloadAllInve
>>> >>r
>>> >>tedIndexDesc(MetadataManager.java:381)] - Loaded 0 Inverted Index
>>> Desc(s)
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,158][INFO][com.kylinolap.cube.cli.DictionaryGener
>>> atorCLI.process
>>> >>S
>>> >>egment(DictionaryGeneratorCLI.java:59)] - Checking snapshot of
>>> STORE_DIM
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,159][ERROR][com.kylinolap.job.hadoop.dict.CreateD
>>> ictionaryJob.ru
>>> >>n
>>> >>(CreateDictionaryJob.java:55)] -
>>> >>java.lang.NullPointerException
>>> >> at 
>>>com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>>> >> at
>>> >>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegme
>>> nt(DictionaryGe
>>> >>n
>>> >>eratorCLI.java:60)
>>> >> at
>>> >>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegme
>>> nt(DictionaryGe
>>> >>n
>>> >>eratorCLI.java:39)
>>> >> at
>>> >>com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(Crea
>>> teDictionaryJob
>>> >>.
>>> >>java:51)
>>> >> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>> >> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>> >> at 
>>>com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:54)
>>> >> at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
>>> >> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>>> >> at
>>> >>org.quartz.simpl.SimpleThreadPool$WorkerThread.run(
>>> SimpleThreadPool.java:
>>> >>5
>>> >>73)
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,159][DEBUG][com.kylinolap.job.cmd.JavaHadoopCmdOu
>>> tput.appendOutp
>>> >>u
>>> >>t(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,166][DEBUG][com.kylinolap.common.persistence.Reso
>>> urceStore.putRe
>>> >>s
>>> >>ource(ResourceStore.java:166)] - Saving resource
>>> >>/job_output/70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 (Store
>>> >>kylin_metadata_qa@hbase<ma...@hbase>)
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,174][DEBUG][com.kylinolap.common.persistence.Reso
>>> urceStore.putRe
>>> >>s
>>> >>ource(ResourceStore.java:166)] - Saving resource
>>> >>/job/70f7dfe5-f414-4643-a014-3ba5c5d3ab22 (Store
>>> >>kylin_metadata_qa@hbase<ma...@hbase>)
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.
>>> execute(JobFlowNod
>>> >>e
>>> >>.java:87)] - Job status for
>>> >>cube_job_group.NDim.70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 has been
>>> >>updated.
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.
>>> execute(JobFlowNod
>>> >>e
>>> >>.java:88)] - cmd: -cubename NDim -segmentname FULL_BUILD -input
>>> >>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_
>>> distinct_column
>>> >>s
>>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>>> >>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.
>>> execute(JobFlowNod
>>> >>e
>>> >>.java:89)] - output:Start to execute command:
>>> >> -cubename NDim -segmentname FULL_BUILD -input
>>> >>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_
>>> distinct_column
>>> >>s
>>> >>Command execute return code 2
>>> >>
>>> >>
>>> >>
>>> >>Regards,
>>> >>Santosh Akhilesh
>>> >>Bangalore R&D
>>> >>HUAWEI TECHNOLOGIES CO.,LTD.
>>> >>
>>> >>www.huawei.com
>>
>>


Re: Cube Build failed at Step 3, When I choose Hierarchical dimension

Posted by Santosh Akhilesh <sa...@gmail.com>.
I have another question: can I add multiple measures on the same column of the fact table?
Like sum(value), count(value), ...
On Wed, 4 Mar 2015 at 9:19 pm, Santosh Akhilesh <sa...@gmail.com>
wrote:

> Oh, it looks like it's about the last_modified time in the JSON; should I set it to 0?
> I am out of office and will give it a try tomorrow. If it is something else, please let me
> know.
> On Wed, 4 Mar 2015 at 7:40 pm, Santoshakhilesh <
> santosh.akhilesh@huawei.com> wrote:
>
>> Hi Shaofeng ,
>>     I have changed the property file to cerate a new htable and
>> everything was fresh.
>>     Unless there is some restriction for join for hierarchical dimension,
>> I think there is a bug in wizard while creating hierarchical dimensions ,
>> it just deletes the join condition on the table on which hierarch dimension
>> is added.
>>
>>     I  have tried to use the JSON editor , when I open it gives a sample
>> json and not the current json of cube. I tried deleting and pasting the
>> JSON which I get from the cube json view but while saving I get the
>> following error.
>>
>>
>> Error Message
>> Overwriting conflict /cube_desc/HierarchyCube.json, expect old TS
>> 1425496007007, but it is 0
>>
>> I had tried deleting the existing cube and then creating using json
>> editor but same error. I also tried removing the UUID part of the json as I
>> thought this you might be generating internally but still the error.
>>
>> So what options I am left with , install 0.7 and give a try ?
>>
>> I had modified the JSON as below to add the join condition. I also have
>> another query why the datatype in json is always null ?
>>
>> {
>>   "uuid": "3721b133-f11a-4ffa-98af-8d8688db3706",
>>   "name": "HierarchyCube",
>>   "description": "",
>>   "dimensions": [
>>     {
>>       "id": 1,
>>       "name": "AREA",
>>       "join": {
>>         "type": "left",
>>         "primary_key": [
>>           "STOREID"
>>         ],
>>         "foreign_key": [
>>           "STOREID"
>>         ]
>>       },
>>       "hierarchy": [
>>         {
>>           "level": "1",
>>           "column": "STATE"
>>         },
>>         {
>>           "level": "2",
>>           "column": "CITY"
>>         }
>>       ],
>>       "table": "STORE_DIM",
>>       "column": "{FK}",
>>       "datatype": null,
>>       "derived": null
>>     },
>>     {
>>       "id": 2,
>>       "name": "SALES_FACT.STOREID",
>>       "join": null,
>>       "hierarchy": null,
>>       "table": "SALES_FACT",
>>       "column": "STOREID",
>>       "datatype": null,
>>       "derived": null
>>     },
>>     {
>>       "id": 3,
>>       "name": "SALES_FACT.ITEMID",
>>       "join": null,
>>       "hierarchy": null,
>>       "table": "SALES_FACT",
>>       "column": "ITEMID",
>>       "datatype": null,
>>       "derived": null
>>     },
>>     {
>>       "id": 4,
>>       "name": "SALES_FACT.CUSTOMERID",
>>       "join": null,
>>       "hierarchy": null,
>>       "table": "SALES_FACT",
>>       "column": "CUSTOMERID",
>>       "datatype": null,
>>       "derived": null
>>     },
>>     {
>>       "id": 5,
>>       "name": "CUSTOMER_DIM_DERIVED",
>>       "join": {
>>         "type": "left",
>>         "primary_key": [
>>           "CUSTOMERID"
>>         ],
>>         "foreign_key": [
>>           "CUSTOMERID"
>>         ]
>>       },
>>       "hierarchy": null,
>>       "table": "CUSTOMER_DIM",
>>       "column": "{FK}",
>>       "datatype": null,
>>       "derived": [
>>         "NAME"
>>       ]
>>     },
>>     {
>>       "id": 6,
>>       "name": "ITEM_DIM_DERIVED",
>>       "join": {
>>         "type": "left",
>>         "primary_key": [
>>           "ITEMID"
>>         ],
>>         "foreign_key": [
>>           "ITEMID"
>>         ]
>>       },
>>       "hierarchy": null,
>>       "table": "ITEM_DIM",
>>       "column": "{FK}",
>>       "datatype": null,
>>       "derived": [
>>         "BRAND",
>>         "COLOR"
>>       ]
>>     }
>>   ],
>>   "measures": [
>>     {
>>       "id": 1,
>>       "name": "_COUNT_",
>>       "function": {
>>         "expression": "COUNT",
>>         "parameter": {
>>           "type": "constant",
>>           "value": "1"
>>         },
>>         "returntype": "bigint"
>>       },
>>       "dependent_measure_ref": null
>>     },
>>     {
>>       "id": 2,
>>       "name": "TOTALAMOUNT",
>>       "function": {
>>         "expression": "SUM",
>>         "parameter": {
>>           "type": "column",
>>           "value": "AMOUNT"
>>         },
>>         "returntype": "double"
>>       },
>>       "dependent_measure_ref": null
>>     },
>>     {
>>       "id": 3,
>>       "name": "TOTALQTY",
>>       "function": {
>>         "expression": "SUM",
>>         "parameter": {
>>           "type": "column",
>>           "value": "QTY"
>>         },
>>         "returntype": "int"
>>       },
>>       "dependent_measure_ref": null
>>     }
>>   ],
>>   "rowkey": {
>>     "rowkey_columns": [
>>       {
>>         "column": "STATE",
>>         "length": 0,
>>         "dictionary": "true",
>>         "mandatory": false
>>       },
>>       {
>>         "column": "CITY",
>>         "length": 0,
>>         "dictionary": "true",
>>         "mandatory": false
>>       },
>>       {
>>         "column": "STOREID",
>>         "length": 0,
>>         "dictionary": "true",
>>         "mandatory": false
>>       },
>>       {
>>         "column": "ITEMID",
>>         "length": 0,
>>         "dictionary": "true",
>>         "mandatory": false
>>       },
>>       {
>>         "column": "CUSTOMERID",
>>         "length": 0,
>>         "dictionary": "true",
>>         "mandatory": false
>>       }
>>     ],
>>     "aggregation_groups": [
>>       [
>>         "STOREID",
>>         "ITEMID",
>>         "CUSTOMERID"
>>       ],
>>       [
>>         "STATE",
>>         "CITY"
>>       ]
>>     ]
>>   },
>>   "signature": "OiAo60gPr38KVois4jHGKw==",
>>   "capacity": "MEDIUM",
>>   "last_modified": 1425496007007,
>>   "fact_table": "SALES_FACT",
>>   "null_string": null,
>>   "filter_condition": null,
>>   "cube_partition_desc": {
>>     "partition_date_column": null,
>>     "partition_date_start": 0,
>>     "cube_partition_type": "APPEND"
>>   },
>>   "hbase_mapping": {
>>     "column_family": [
>>       {
>>         "name": "F1",
>>         "columns": [
>>           {
>>             "qualifier": "M",
>>             "measure_refs": [
>>               "_COUNT_",
>>               "TOTALAMOUNT",
>>               "TOTALQTY"
>>             ]
>>           }
>>         ]
>>       }
>>     ]
>>   },
>>   "notify_list": []
>> }
>>
>>
>> Regards,
>> Santosh Akhilesh
>> Bangalore R&D
>> HUAWEI TECHNOLOGIES CO.,LTD.
>>
>> www.huawei.com
>>
>> ________________________________________
>> From: Shi, Shaofeng [shaoshi@ebay.com]
>> Sent: Wednesday, March 04, 2015 6:40 PM
>> To: dev@kylin.incubator.apache.org
>> Cc: Kulbhushan Rana
>> Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
>> dimension
>>
>> Hi Santosh, 0.6.5 should be stable, but there is an obvious error in this
>> cube JSON: the dimension 1 is from lookup table “STORE_DIM”, in this case
>> the “join” must be specified, while now it is null; This would cause Kylin
>> fail to join fact table with the lookup table. Please try to edit the cube
>> to add the join condition, let it looks like other dimensions; If the
>> wizard couldn’t work, try to manually add join and then use the “JSON
>> Editor” function to update the cube;
>>
>> From now on we would suggest user to use 0.7.1; The binary package will be
>> much easier to install; and there are many enhancements and bug fixes in
>> 0.7.1;
>>
>> To get a fresh metadata store, just use a different htable, this is
>> configurable in kylin.properties:
>>
>> kylin.metadata.url=kylin_metadata_qa@hbase
>>
>> The default name is kylin_metadata_qa, if change a name, Kylin will get a
>> fresh metadata store;
>>
>>
>> On 3/4/15, 8:52 PM, "Santoshakhilesh" <sa...@huawei.com>
>> wrote:
>>
>> >Hi Shaofeng ,
>> >
>> >      I had deleted the cube and tried to build again for hierarchy, now
>> >it fails in first step itself.
>> >      I ahve three dimension tables
>> >      a) customer_dim b_ store_dim c) item_dim
>> >      I choose left join with fact table while cerating dimensions , but
>> >after cube creation left join on sotre_dim is automatically deleted by
>> >kylin
>> >      store dimension has fields stoireid , city , state , I had tried to
>> >add the hierarchy dimension (1) State and (2) City.
>> >
>> >      I have started facing many issues , even with normal dimensions the
>> >kylin query result not matching with hive query.
>> >      I had also tried to delete all the metadata in hive and hbase (i
>> >had deleted all the entries) and started from beginning by creating a new
>> >project but now problem is persisting.
>> >
>> >      Do you suggest me to install binary distribution is it stable
>> >enough now ? If yes how do I make sure all the previous data is deleted.
>> >deleting hive and hbase data is enough or I should delete something else
>> >too ?
>> >
>> >      Logs are as below. Sorry for long mail.
>> >
>> >
>> >
>> >
>> >
>> >JSON:
>> >{
>> >  "uuid": "5a5adb86-202a-4e2b-9be7-de421d3bdf2f",
>> >  "name": "Hierarchy",
>> >  "description": "",
>> >  "dimensions": [
>> >    {
>> >      "id": 1,
>> >      "name": "AREA",
>> >      "join": null,
>> >      "hierarchy": [
>> >        {
>> >          "level": "1",
>> >          "column": "STATE"
>> >        },
>> >        {
>> >          "level": "2",
>> >          "column": "CITY"
>> >        }
>> >      ],
>> >      "table": "STORE_DIM",
>> >      "column": null,
>> >      "datatype": null,
>> >      "derived": null
>> >    },
>> >    {
>> >      "id": 2,
>> >      "name": "CUSTOMER_DIM_DERIVED",
>> >      "join": {
>> >        "type": "left",
>> >        "primary_key": [
>> >          "CUSTOMERID"
>> >        ],
>> >        "foreign_key": [
>> >          "CUSTOMERID"
>> >        ]
>> >      },
>> >      "hierarchy": null,
>> >      "table": "CUSTOMER_DIM",
>> >      "column": "{FK}",
>> >      "datatype": null,
>> >      "derived": [
>> >        "NAME"
>> >      ]
>> >    },
>> >    {
>> >      "id": 3,
>> >      "name": "ITEM_DIM_DERIVED",
>> >      "join": {
>> >        "type": "left",
>> >        "primary_key": [
>> >          "ITEMID"
>> >        ],
>> >        "foreign_key": [
>> >          "ITEMID"
>> >        ]
>> >      },
>> >      "hierarchy": null,
>> >      "table": "ITEM_DIM",
>> >      "column": "{FK}",
>> >      "datatype": null,
>> >      "derived": [
>> >        "TYPE",
>> >        "BRAND",
>> >        "COLOR"
>> >      ]
>> >    }
>> >  ],
>> >  "measures": [
>> >    {
>> >      "id": 1,
>> >      "name": "_COUNT_",
>> >      "function": {
>> >        "expression": "COUNT",
>> >        "parameter": {
>> >          "type": "constant",
>> >          "value": "1"
>> >        },
>> >        "returntype": "bigint"
>> >      },
>> >      "dependent_measure_ref": null
>> >    },
>> >    {
>> >      "id": 2,
>> >      "name": "TOTALAMOUNT",
>> >      "function": {
>> >        "expression": "SUM",
>> >        "parameter": {
>> >          "type": "column",
>> >          "value": "AMOUNT"
>> >        },
>> >        "returntype": "double"
>> >      },
>> >      "dependent_measure_ref": null
>> >    },
>> >    {
>> >      "id": 3,
>> >      "name": "TOTALQTY",
>> >      "function": {
>> >        "expression": "SUM",
>> >        "parameter": {
>> >          "type": "column",
>> >          "value": "QTY"
>> >        },
>> >        "returntype": "int"
>> >      },
>> >      "dependent_measure_ref": null
>> >    }
>> >  ],
>> >  "rowkey": {
>> >    "rowkey_columns": [
>> >      {
>> >        "column": "STATE",
>> >        "length": 0,
>> >        "dictionary": "true",
>> >        "mandatory": false
>> >      },
>> >      {
>> >        "column": "CITY",
>> >        "length": 0,
>> >        "dictionary": "true",
>> >        "mandatory": false
>> >      },
>> >      {
>> >        "column": "CUSTOMERID",
>> >        "length": 0,
>> >        "dictionary": "true",
>> >        "mandatory": false
>> >      },
>> >      {
>> >        "column": "ITEMID",
>> >        "length": 0,
>> >        "dictionary": "true",
>> >        "mandatory": false
>> >      }
>> >    ],
>> >    "aggregation_groups": [
>> >      [
>> >        "CUSTOMERID",
>> >        "ITEMID"
>> >      ],
>> >      [
>> >        "STATE",
>> >        "CITY"
>> >      ]
>> >    ]
>> >  },
>> >  "signature": "X6NQ6wZ9ZgvhBLqw0YAKhQ==",
>> >  "capacity": "MEDIUM",
>> >  "last_modified": 1425492494239,
>> >  "fact_table": "SALES_FACT",
>> >  "null_string": null,
>> >  "filter_condition": null,
>> >  "cube_partition_desc": {
>> >    "partition_date_column": null,
>> >    "partition_date_start": 0,
>> >    "cube_partition_type": "APPEND"
>> >  },
>> >  "hbase_mapping": {
>> >    "column_family": [
>> >      {
>> >        "name": "F1",
>> >        "columns": [
>> >          {
>> >            "qualifier": "M",
>> >            "measure_refs": [
>> >              "_COUNT_",
>> >              "TOTALAMOUNT",
>> >              "TOTALQTY"
>> >            ]
>> >          }
>> >        ]
>> >      }
>> >    ]
>> >  },
>> >  "notify_list": []
>> >}
>> >
>> >Logs:
>> >
>> >15/03/05 02:09:18 WARN conf.HiveConf: DEPRECATED: Configuration property
>> >hive.metastore.local no longer has any effect. Make sure to provide a
>> >valid value for hive.metastore.uris if you are connecting to a remote
>> >metastore.
>> >15/03/05 02:09:18 WARN conf.HiveConf: HiveConf of name
>> >hive.metastore.local does not exist
>> >Logging initialized using configuration in
>> >jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-common-
>> 0.14.0.jar!/hive
>> >-log4j.properties
>> >SLF4J: Class path contains multiple SLF4J bindings.
>> >SLF4J: Found binding in
>> >[jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/common/lib/
>> slf4j-log4j12-1
>> >.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> >SLF4J: Found binding in
>> >[jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-jdbc-0.
>> 14.0-standalone
>> >.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> >SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> >explanation.
>> >SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> >OK
>> >Time taken: 0.578 seconds
>> >OK
>> >Time taken: 0.444 seconds
>> >FAILED: SemanticException [Error 10004]: Line 4:0 Invalid table alias or
>> >column reference 'STORE_DIM': (possible column names are:
>> >sales_fact.storeid, sales_fact.itemid, sales_fact.customerid,
>> >sales_fact.qty, sales_fact.amount, customer_dim.customerid,
>> >customer_dim.name, item_dim.itemid, item_dim.type, item_dim.brand,
>> >item_dim.color)
>> >
>> >
>> >
>> >Regards,
>> >Santosh Akhilesh
>> >Bangalore R&D
>> >HUAWEI TECHNOLOGIES CO.,LTD.
>> >
>> >www.huawei.com
>> >
>> >________________________________________
>> >From: Shi, Shaofeng [shaoshi@ebay.com]
>> >Sent: Wednesday, March 04, 2015 5:36 PM
>> >To: dev@kylin.incubator.apache.org
>> >Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
>> >dimension
>> >
>> >It seems that you have a lookup table which doesn't define the join
>> >relationship; Could you paste the full json of this cube definition?
>> >
>> >On 3/4/15, 3:28 PM, "Santoshakhilesh" <sa...@huawei.com>
>> wrote:
>> >
>> >>Dear All ,
>> >>
>> >>         I am using 0.6.5 branch of Kylin. I was able to build a cube
>> >>defining normal and derived measures and play with it.
>> >>
>> >>         I have defined a new cube to test hierarchial dimensions and
>> >>cube build is failed at Step 3 with following log in kylin.log
>> >>
>> >>         I have run the query which kylin provides on webui of cube on
>> >>hive and it works.
>> >>
>> >>         Please let me know whats going wrong ? Any more info required
>> >>from me please let me know.
>> >>
>> >>
>> >>
>> >>java.lang.NullPointerException
>> >> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>> >>
>> >>
>> >>
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,025][INFO][com.kylinolap.dict.lookup.SnapshotMana
>> ger.load(Snapsh
>> >>o
>> >>tManager.java:156)] - Loading snapshotTable from
>> >>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f1
>> 9283a.snapshot,
>> >>with loadData: false
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,031][INFO][com.kylinolap.dict.lookup.SnapshotMana
>> ger.buildSnapsh
>> >>o
>> >>t(SnapshotManager.java:90)] - Identical input FileSignature
>> >>[path=file:/hive/warehouse/store_dim/stores.txt, size=60,
>> >>lastModifiedTime=1425039202000], reuse existing snapshot at
>> >>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f1
>> 9283a.snapshot
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,031][DEBUG][com.kylinolap.common.persistence.Reso
>> urceStore.putRe
>> >>s
>> >>ource(ResourceStore.java:166)] - Saving resource /cube/NDim.json (Store
>> >>kylin_metadata_qa@hbase<ma...@hbase>)
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,035][DEBUG][com.kylinolap.metadata.MetadataManage
>> r.reloadAllSour
>> >>c
>> >>eTable(MetadataManager.java:258)] - Reloading SourceTable from folder
>> >>kylin_metadata_qa(key='/table')@kylin_metadata_qa@hbase
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManage
>> r.reloadAllSour
>> >>c
>> >>eTable(MetadataManager.java:267)] - Loaded 4 SourceTable(s)
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManage
>> r.reloadAllSour
>> >>c
>> >>eTableExd(MetadataManager.java:243)] - Reloading SourceTable exd info
>> >>from folder kylin_metadata_qa(key='/table_exd')@kylin_metadata_qa@hbase
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,104][DEBUG][com.kylinolap.metadata.MetadataManage
>> r.reloadAllSour
>> >>c
>> >>eTableExd(MetadataManager.java:253)] - Loaded 4 SourceTable EXD(s)
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,104][INFO][com.kylinolap.metadata.MetadataManager
>> .reloadAllCubeD
>> >>e
>> >>sc(MetadataManager.java:308)] - Reloading Cube Metadata from folder
>> >>kylin_metadata_qa(key='/cube_desc')@kylin_metadata_qa@hbase
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,143][DEBUG][com.kylinolap.metadata.MetadataManage
>> r.reloadAllCube
>> >>D
>> >>esc(MetadataManager.java:333)] - Loaded 4 Cube(s)
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,143][INFO][com.kylinolap.metadata.MetadataManager
>> .reloadAllInver
>> >>t
>> >>edIndexDesc(MetadataManager.java:356)] - Reloading Inverted Index Desc
>> >>from folder
>> >>kylin_metadata_qa(key='/invertedindex_desc')@kylin_metadata_qa@hbase
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,147][DEBUG][com.kylinolap.metadata.MetadataManage
>> r.reloadAllInve
>> >>r
>> >>tedIndexDesc(MetadataManager.java:381)] - Loaded 0 Inverted Index
>> Desc(s)
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,158][INFO][com.kylinolap.cube.cli.DictionaryGener
>> atorCLI.process
>> >>S
>> >>egment(DictionaryGeneratorCLI.java:59)] - Checking snapshot of
>> STORE_DIM
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,159][ERROR][com.kylinolap.job.hadoop.dict.CreateD
>> ictionaryJob.ru
>> >>n
>> >>(CreateDictionaryJob.java:55)] -
>> >>java.lang.NullPointerException
>> >> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>> >> at
>> >>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegme
>> nt(DictionaryGe
>> >>n
>> >>eratorCLI.java:60)
>> >> at
>> >>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegme
>> nt(DictionaryGe
>> >>n
>> >>eratorCLI.java:39)
>> >> at
>> >>com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(Crea
>> teDictionaryJob
>> >>.
>> >>java:51)
>> >> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>> >> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>> >> at com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:54)
>> >> at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
>> >> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>> >> at
>> >>org.quartz.simpl.SimpleThreadPool$WorkerThread.run(
>> SimpleThreadPool.java:
>> >>5
>> >>73)
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,159][DEBUG][com.kylinolap.job.cmd.JavaHadoopCmdOu
>> tput.appendOutp
>> >>u
>> >>t(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,166][DEBUG][com.kylinolap.common.persistence.Reso
>> urceStore.putRe
>> >>s
>> >>ource(ResourceStore.java:166)] - Saving resource
>> >>/job_output/70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 (Store
>> >>kylin_metadata_qa@hbase<ma...@hbase>)
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,174][DEBUG][com.kylinolap.common.persistence.Reso
>> urceStore.putRe
>> >>s
>> >>ource(ResourceStore.java:166)] - Saving resource
>> >>/job/70f7dfe5-f414-4643-a014-3ba5c5d3ab22 (Store
>> >>kylin_metadata_qa@hbase<ma...@hbase>)
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.
>> execute(JobFlowNod
>> >>e
>> >>.java:87)] - Job status for
>> >>cube_job_group.NDim.70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 has been
>> >>updated.
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.
>> execute(JobFlowNod
>> >>e
>> >>.java:88)] - cmd: -cubename NDim -segmentname FULL_BUILD -input
>> >>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_
>> distinct_column
>> >>s
>> >>[QuartzScheduler_Worker-10]:[2015-03-04
>> >>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.
>> execute(JobFlowNod
>> >>e
>> >>.java:89)] - output:Start to execute command:
>> >> -cubename NDim -segmentname FULL_BUILD -input
>> >>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_
>> distinct_column
>> >>s
>> >>Command execute return code 2
>> >>
>> >>
>> >>
>> >>Regards,
>> >>Santosh Akhilesh
>> >>Bangalore R&D
>> >>HUAWEI TECHNOLOGIES CO.,LTD.
>> >>
>> >>www.huawei.com
>
>

Re: Cube Build failed at Step 3, When I choose Hierarchical dimension

Posted by Santosh Akhilesh <sa...@gmail.com>.
Oh, it looks like it's about the last_modified time in the JSON; should I set it to 0?
I am out of office and will give it a try tomorrow. If it is something else, please let me
know.
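
From the error text, the metadata store does an optimistic timestamp check when a cube descriptor is saved: the last_modified value carried inside the submitted JSON is compared with the timestamp the store records for /cube_desc/HierarchyCube.json, and the save is rejected whenever the two disagree; the "expect old TS 1425496007007, but it is 0" message quoted below reports exactly such a mismatch. Setting last_modified to 0 in the pasted JSON, as guessed above, is one way to make the two agree if the old descriptor really is gone from the store; if it is still there, matching its stored timestamp would be the alternative. In either case, the line in question in the descriptor is simply:

  "last_modified": 0

or, if the old descriptor still exists in the store:

  "last_modified": 1425496007007
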
On Wed, 4 Mar 2015 at 7:40 pm, Santoshakhilesh <sa...@huawei.com>
wrote:

> Hi Shaofeng ,
>     I have changed the property file to cerate a new htable and everything
> was fresh.
>     Unless there is some restriction for join for hierarchical dimension,
> I think there is a bug in wizard while creating hierarchical dimensions ,
> it just deletes the join condition on the table on which hierarch dimension
> is added.
>
>     I  have tried to use the JSON editor , when I open it gives a sample
> json and not the current json of cube. I tried deleting and pasting the
> JSON which I get from the cube json view but while saving I get the
> following error.
>
>
> Error Message
> Overwriting conflict /cube_desc/HierarchyCube.json, expect old TS
> 1425496007007, but it is 0
>
> I had tried deleting the existing cube and then creating it using the json
> editor, but I get the same error. I also tried removing the UUID part of the
> json, as I thought you might be generating this internally, but the error remains.
>
> So what options am I left with; should I install 0.7 and give it a try?
>
> I have modified the JSON as below to add the join condition. I also have
> another question: why is the datatype in the json always null?
>
> {
>   "uuid": "3721b133-f11a-4ffa-98af-8d8688db3706",
>   "name": "HierarchyCube",
>   "description": "",
>   "dimensions": [
>     {
>       "id": 1,
>       "name": "AREA",
>       "join": {
>         "type": "left",
>         "primary_key": [
>           "STOREID"
>         ],
>         "foreign_key": [
>           "STOREID"
>         ]
>       },
>       "hierarchy": [
>         {
>           "level": "1",
>           "column": "STATE"
>         },
>         {
>           "level": "2",
>           "column": "CITY"
>         }
>       ],
>       "table": "STORE_DIM",
>       "column": "{FK}",
>       "datatype": null,
>       "derived": null
>     },
>     {
>       "id": 2,
>       "name": "SALES_FACT.STOREID",
>       "join": null,
>       "hierarchy": null,
>       "table": "SALES_FACT",
>       "column": "STOREID",
>       "datatype": null,
>       "derived": null
>     },
>     {
>       "id": 3,
>       "name": "SALES_FACT.ITEMID",
>       "join": null,
>       "hierarchy": null,
>       "table": "SALES_FACT",
>       "column": "ITEMID",
>       "datatype": null,
>       "derived": null
>     },
>     {
>       "id": 4,
>       "name": "SALES_FACT.CUSTOMERID",
>       "join": null,
>       "hierarchy": null,
>       "table": "SALES_FACT",
>       "column": "CUSTOMERID",
>       "datatype": null,
>       "derived": null
>     },
>     {
>       "id": 5,
>       "name": "CUSTOMER_DIM_DERIVED",
>       "join": {
>         "type": "left",
>         "primary_key": [
>           "CUSTOMERID"
>         ],
>         "foreign_key": [
>           "CUSTOMERID"
>         ]
>       },
>       "hierarchy": null,
>       "table": "CUSTOMER_DIM",
>       "column": "{FK}",
>       "datatype": null,
>       "derived": [
>         "NAME"
>       ]
>     },
>     {
>       "id": 6,
>       "name": "ITEM_DIM_DERIVED",
>       "join": {
>         "type": "left",
>         "primary_key": [
>           "ITEMID"
>         ],
>         "foreign_key": [
>           "ITEMID"
>         ]
>       },
>       "hierarchy": null,
>       "table": "ITEM_DIM",
>       "column": "{FK}",
>       "datatype": null,
>       "derived": [
>         "BRAND",
>         "COLOR"
>       ]
>     }
>   ],
>   "measures": [
>     {
>       "id": 1,
>       "name": "_COUNT_",
>       "function": {
>         "expression": "COUNT",
>         "parameter": {
>           "type": "constant",
>           "value": "1"
>         },
>         "returntype": "bigint"
>       },
>       "dependent_measure_ref": null
>     },
>     {
>       "id": 2,
>       "name": "TOTALAMOUNT",
>       "function": {
>         "expression": "SUM",
>         "parameter": {
>           "type": "column",
>           "value": "AMOUNT"
>         },
>         "returntype": "double"
>       },
>       "dependent_measure_ref": null
>     },
>     {
>       "id": 3,
>       "name": "TOTALQTY",
>       "function": {
>         "expression": "SUM",
>         "parameter": {
>           "type": "column",
>           "value": "QTY"
>         },
>         "returntype": "int"
>       },
>       "dependent_measure_ref": null
>     }
>   ],
>   "rowkey": {
>     "rowkey_columns": [
>       {
>         "column": "STATE",
>         "length": 0,
>         "dictionary": "true",
>         "mandatory": false
>       },
>       {
>         "column": "CITY",
>         "length": 0,
>         "dictionary": "true",
>         "mandatory": false
>       },
>       {
>         "column": "STOREID",
>         "length": 0,
>         "dictionary": "true",
>         "mandatory": false
>       },
>       {
>         "column": "ITEMID",
>         "length": 0,
>         "dictionary": "true",
>         "mandatory": false
>       },
>       {
>         "column": "CUSTOMERID",
>         "length": 0,
>         "dictionary": "true",
>         "mandatory": false
>       }
>     ],
>     "aggregation_groups": [
>       [
>         "STOREID",
>         "ITEMID",
>         "CUSTOMERID"
>       ],
>       [
>         "STATE",
>         "CITY"
>       ]
>     ]
>   },
>   "signature": "OiAo60gPr38KVois4jHGKw==",
>   "capacity": "MEDIUM",
>   "last_modified": 1425496007007,
>   "fact_table": "SALES_FACT",
>   "null_string": null,
>   "filter_condition": null,
>   "cube_partition_desc": {
>     "partition_date_column": null,
>     "partition_date_start": 0,
>     "cube_partition_type": "APPEND"
>   },
>   "hbase_mapping": {
>     "column_family": [
>       {
>         "name": "F1",
>         "columns": [
>           {
>             "qualifier": "M",
>             "measure_refs": [
>               "_COUNT_",
>               "TOTALAMOUNT",
>               "TOTALQTY"
>             ]
>           }
>         ]
>       }
>     ]
>   },
>   "notify_list": []
> }
>
>
> Regards,
> Santosh Akhilesh
> Bangalore R&D
> HUAWEI TECHNOLOGIES CO.,LTD.
>
> www.huawei.com
>
> ________________________________________
> From: Shi, Shaofeng [shaoshi@ebay.com]
> Sent: Wednesday, March 04, 2015 6:40 PM
> To: dev@kylin.incubator.apache.org
> Cc: Kulbhushan Rana
> Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
> dimension
>
> Hi Santosh, 0.6.5 should be stable, but there is an obvious error in this
> cube JSON: dimension 1 is from the lookup table “STORE_DIM”, and in this case
> the “join” must be specified, while now it is null; this would cause Kylin to
> fail to join the fact table with the lookup table. Please try to edit the cube
> to add the join condition so that it looks like the other dimensions; if the
> wizard doesn’t work, try to manually add the join and then use the “JSON
> Editor” function to update the cube;
>
> From now on we would suggest users use 0.7.1; the binary package will be
> much easier to install, and there are many enhancements and bug fixes in
> 0.7.1;
>
> To get a fresh metadata store, just use a different htable, this is
> configurable in kylin.properties:
>
> kylin.metadata.url=kylin_metadata_qa@hbase
>
> The default name is kylin_metadata_qa; if you change the name, Kylin will get a
> fresh metadata store;
>
>
> On 3/4/15, 8:52 PM, "Santoshakhilesh" <sa...@huawei.com> wrote:
>
> >Hi Shaofeng ,
> >
> >      I had deleted the cube and tried to build again for the hierarchy; now
> >it fails in the first step itself.
> >      I have three dimension tables:
> >      a) customer_dim b) store_dim c) item_dim
> >      I chose a left join with the fact table while creating the dimensions, but
> >after cube creation the left join on store_dim is automatically deleted by
> >Kylin.
> >      The store dimension has the fields storeid, city, state; I had tried to
> >add the hierarchy dimension as (1) State and (2) City.
> >
> >      I have started facing many issues; even with normal dimensions the
> >Kylin query result does not match the Hive query.
> >      I had also tried to delete all the metadata in Hive and HBase (I
> >had deleted all the entries) and started from the beginning by creating a new
> >project, but the problem persists.
> >
> >      Do you suggest I install the binary distribution; is it stable
> >enough now? If yes, how do I make sure all the previous data is deleted?
> >Is deleting the Hive and HBase data enough, or should I delete something else
> >too?
> >
> >      Logs are as below. Sorry for long mail.
> >
> >
> >
> >
> >
> >JSON:
> >{
> >  "uuid": "5a5adb86-202a-4e2b-9be7-de421d3bdf2f",
> >  "name": "Hierarchy",
> >  "description": "",
> >  "dimensions": [
> >    {
> >      "id": 1,
> >      "name": "AREA",
> >      "join": null,
> >      "hierarchy": [
> >        {
> >          "level": "1",
> >          "column": "STATE"
> >        },
> >        {
> >          "level": "2",
> >          "column": "CITY"
> >        }
> >      ],
> >      "table": "STORE_DIM",
> >      "column": null,
> >      "datatype": null,
> >      "derived": null
> >    },
> >    {
> >      "id": 2,
> >      "name": "CUSTOMER_DIM_DERIVED",
> >      "join": {
> >        "type": "left",
> >        "primary_key": [
> >          "CUSTOMERID"
> >        ],
> >        "foreign_key": [
> >          "CUSTOMERID"
> >        ]
> >      },
> >      "hierarchy": null,
> >      "table": "CUSTOMER_DIM",
> >      "column": "{FK}",
> >      "datatype": null,
> >      "derived": [
> >        "NAME"
> >      ]
> >    },
> >    {
> >      "id": 3,
> >      "name": "ITEM_DIM_DERIVED",
> >      "join": {
> >        "type": "left",
> >        "primary_key": [
> >          "ITEMID"
> >        ],
> >        "foreign_key": [
> >          "ITEMID"
> >        ]
> >      },
> >      "hierarchy": null,
> >      "table": "ITEM_DIM",
> >      "column": "{FK}",
> >      "datatype": null,
> >      "derived": [
> >        "TYPE",
> >        "BRAND",
> >        "COLOR"
> >      ]
> >    }
> >  ],
> >  "measures": [
> >    {
> >      "id": 1,
> >      "name": "_COUNT_",
> >      "function": {
> >        "expression": "COUNT",
> >        "parameter": {
> >          "type": "constant",
> >          "value": "1"
> >        },
> >        "returntype": "bigint"
> >      },
> >      "dependent_measure_ref": null
> >    },
> >    {
> >      "id": 2,
> >      "name": "TOTALAMOUNT",
> >      "function": {
> >        "expression": "SUM",
> >        "parameter": {
> >          "type": "column",
> >          "value": "AMOUNT"
> >        },
> >        "returntype": "double"
> >      },
> >      "dependent_measure_ref": null
> >    },
> >    {
> >      "id": 3,
> >      "name": "TOTALQTY",
> >      "function": {
> >        "expression": "SUM",
> >        "parameter": {
> >          "type": "column",
> >          "value": "QTY"
> >        },
> >        "returntype": "int"
> >      },
> >      "dependent_measure_ref": null
> >    }
> >  ],
> >  "rowkey": {
> >    "rowkey_columns": [
> >      {
> >        "column": "STATE",
> >        "length": 0,
> >        "dictionary": "true",
> >        "mandatory": false
> >      },
> >      {
> >        "column": "CITY",
> >        "length": 0,
> >        "dictionary": "true",
> >        "mandatory": false
> >      },
> >      {
> >        "column": "CUSTOMERID",
> >        "length": 0,
> >        "dictionary": "true",
> >        "mandatory": false
> >      },
> >      {
> >        "column": "ITEMID",
> >        "length": 0,
> >        "dictionary": "true",
> >        "mandatory": false
> >      }
> >    ],
> >    "aggregation_groups": [
> >      [
> >        "CUSTOMERID",
> >        "ITEMID"
> >      ],
> >      [
> >        "STATE",
> >        "CITY"
> >      ]
> >    ]
> >  },
> >  "signature": "X6NQ6wZ9ZgvhBLqw0YAKhQ==",
> >  "capacity": "MEDIUM",
> >  "last_modified": 1425492494239,
> >  "fact_table": "SALES_FACT",
> >  "null_string": null,
> >  "filter_condition": null,
> >  "cube_partition_desc": {
> >    "partition_date_column": null,
> >    "partition_date_start": 0,
> >    "cube_partition_type": "APPEND"
> >  },
> >  "hbase_mapping": {
> >    "column_family": [
> >      {
> >        "name": "F1",
> >        "columns": [
> >          {
> >            "qualifier": "M",
> >            "measure_refs": [
> >              "_COUNT_",
> >              "TOTALAMOUNT",
> >              "TOTALQTY"
> >            ]
> >          }
> >        ]
> >      }
> >    ]
> >  },
> >  "notify_list": []
> >}
> >
> >Logs:
> >
> >15/03/05 02:09:18 WARN conf.HiveConf: DEPRECATED: Configuration property
> >hive.metastore.local no longer has any effect. Make sure to provide a
> >valid value for hive.metastore.uris if you are connecting to a remote
> >metastore.
> >15/03/05 02:09:18 WARN conf.HiveConf: HiveConf of name
> >hive.metastore.local does not exist
> >Logging initialized using configuration in
> >jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-common-0.14.0.jar!/hive-log4j.properties
> >SLF4J: Class path contains multiple SLF4J bindings.
> >SLF4J: Found binding in
> >[jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >SLF4J: Found binding in
> >[jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >explanation.
> >SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> >OK
> >Time taken: 0.578 seconds
> >OK
> >Time taken: 0.444 seconds
> >FAILED: SemanticException [Error 10004]: Line 4:0 Invalid table alias or
> >column reference 'STORE_DIM': (possible column names are:
> >sales_fact.storeid, sales_fact.itemid, sales_fact.customerid,
> >sales_fact.qty, sales_fact.amount, customer_dim.customerid,
> >customer_dim.name, item_dim.itemid, item_dim.type, item_dim.brand,
> >item_dim.color)
> >
> >
> >
> >Regards,
> >Santosh Akhilesh
> >Bangalore R&D
> >HUAWEI TECHNOLOGIES CO.,LTD.
> >
> >www.huawei.com
> >
> >________________________________________
> >From: Shi, Shaofeng [shaoshi@ebay.com]
> >Sent: Wednesday, March 04, 2015 5:36 PM
> >To: dev@kylin.incubator.apache.org
> >Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
> >dimension
> >
> >It seems that you have a lookup table which doesn¹t define the join
> >relationship; Could you paste the full json of this cube definition?
> >
> >On 3/4/15, 3:28 PM, "Santoshakhilesh" <sa...@huawei.com>
> wrote:
> >
> >>Dear All ,
> >>
> >>         I am using 0.6.5 branch of Kylin. I was able to build a cube
> >>defining normal and derived measures and play with it.
> >>
> >>         I have defined a new cube to test hierarchial dimensions and
> >>cube build is failed at Step 3 with following log in kylin.log
> >>
> >>         I have run the query which kylin provides on webui of cube on
> >>hive and it works.
> >>
> >>         Please let me know whats going wrong ? Any more info required
> >>from me please let me know.
> >>
> >>
> >>
> >>java.lang.NullPointerException
> >> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
> >>
> >>
> >>
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,025][INFO][com.kylinolap.dict.lookup.
> SnapshotManager.load(Snapsh
> >>o
> >>tManager.java:156)] - Loading snapshotTable from
> >>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-
> 7e561f19283a.snapshot,
> >>with loadData: false
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,031][INFO][com.kylinolap.dict.lookup.
> SnapshotManager.buildSnapsh
> >>o
> >>t(SnapshotManager.java:90)] - Identical input FileSignature
> >>[path=file:/hive/warehouse/store_dim/stores.txt, size=60,
> >>lastModifiedTime=1425039202000], reuse existing snapshot at
> >>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,031][DEBUG][com.kylinolap.common.persistence.
> ResourceStore.putRe
> >>s
> >>ource(ResourceStore.java:166)] - Saving resource /cube/NDim.json (Store
> >>kylin_metadata_qa@hbase<ma...@hbase>)
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,035][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllSour
> >>c
> >>eTable(MetadataManager.java:258)] - Reloading SourceTable from folder
> >>kylin_metadata_qa(key='/table')@kylin_metadata_qa@hbase
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,074][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllSour
> >>c
> >>eTable(MetadataManager.java:267)] - Loaded 4 SourceTable(s)
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,074][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllSour
> >>c
> >>eTableExd(MetadataManager.java:243)] - Reloading SourceTable exd info
> >>from folder kylin_metadata_qa(key='/table_exd')@kylin_metadata_qa@hbase
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,104][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllSour
> >>c
> >>eTableExd(MetadataManager.java:253)] - Loaded 4 SourceTable EXD(s)
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,104][INFO][com.kylinolap.metadata.
> MetadataManager.reloadAllCubeD
> >>e
> >>sc(MetadataManager.java:308)] - Reloading Cube Metadata from folder
> >>kylin_metadata_qa(key='/cube_desc')@kylin_metadata_qa@hbase
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,143][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllCube
> >>D
> >>esc(MetadataManager.java:333)] - Loaded 4 Cube(s)
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,143][INFO][com.kylinolap.metadata.
> MetadataManager.reloadAllInver
> >>t
> >>edIndexDesc(MetadataManager.java:356)] - Reloading Inverted Index Desc
> >>from folder
> >>kylin_metadata_qa(key='/invertedindex_desc')@kylin_metadata_qa@hbase
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,147][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllInve
> >>r
> >>tedIndexDesc(MetadataManager.java:381)] - Loaded 0 Inverted Index
> Desc(s)
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,158][INFO][com.kylinolap.cube.cli.
> DictionaryGeneratorCLI.process
> >>S
> >>egment(DictionaryGeneratorCLI.java:59)] - Checking snapshot of STORE_DIM
> >>[QuartzScheduler_Worker-10]:[2015-03-04 19:00:49,159][ERROR][com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:55)] -
> >>java.lang.NullPointerException
> >> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
> >> at com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:60)
> >> at com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:39)
> >> at com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:51)
> >> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> >> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> >> at com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:54)
> >> at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
> >> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
> >> at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,159][DEBUG][com.kylinolap.job.cmd.
> JavaHadoopCmdOutput.appendOutp
> >>u
> >>t(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,166][DEBUG][com.kylinolap.common.persistence.
> ResourceStore.putRe
> >>s
> >>ource(ResourceStore.java:166)] - Saving resource
> >>/job_output/70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 (Store
> >>kylin_metadata_qa@hbase<ma...@hbase>)
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,174][DEBUG][com.kylinolap.common.persistence.
> ResourceStore.putRe
> >>s
> >>ource(ResourceStore.java:166)] - Saving resource
> >>/job/70f7dfe5-f414-4643-a014-3ba5c5d3ab22 (Store
> >>kylin_metadata_qa@hbase<ma...@hbase>)
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,177][INFO][com.kylinolap.job.flow.
> JobFlowNode.execute(JobFlowNod
> >>e
> >>.java:87)] - Job status for
> >>cube_job_group.NDim.70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 has been
> >>updated.
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,177][INFO][com.kylinolap.job.flow.
> JobFlowNode.execute(JobFlowNod
> >>e
> >>.java:88)] - cmd: -cubename NDim -segmentname FULL_BUILD -input
> >>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/
> fact_distinct_column
> >>s
> >>[QuartzScheduler_Worker-10]:[2015-03-04
> >>19:00:49,177][INFO][com.kylinolap.job.flow.
> JobFlowNode.execute(JobFlowNod
> >>e
> >>.java:89)] - output:Start to execute command:
> >> -cubename NDim -segmentname FULL_BUILD -input
> >>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/
> fact_distinct_column
> >>s
> >>Command execute return code 2
> >>
> >>
> >>
> >>Regards,
> >>Santosh Akhilesh
> >>Bangalore R&D
> >>HUAWEI TECHNOLOGIES CO.,LTD.
> >>
> >>www.huawei.com

Re: Cube Build failed at Step 3 , When I choose Hierarchial dimension

Posted by Santosh Akhilesh <sa...@gmail.com>.
Thanks. I am heading to the office and will give it a try. I have an internal demo
coming up today, so I want to stick to 0.6.5 this week; I will install 0.7 after my
demo. I actually did try the drop-down edit option, but it opens a wizard and I
could not see how to edit using JSON. I will recheck. I appreciate your help.
On Thu, 5 Mar 2015 at 7:25 am, Shi, Shaofeng <sh...@ebay.com> wrote:

> There are two entries for the JSON editor: one is at the right upper
> corner of the page, which is to create a new cube with the JSON editor;
> The other one is in the Actions drop down list for cube (in the “admins”
> column, only open for admin role), which is for user to edit an existing
> cube; What I mentioned is the second entry;
>
> If you create a new cube, the “last_modified” value need be 0;
>
> The uuid isn’t in use for now I think; The key for a cube is the name;
> Please ensure each cube has a different name and uuid.
>
> We suggest user to upgrade to 0.7.1 now, so that we are at the same code
> level, that would be helpful for troubleshooting;
>
> On 3/4/15, 10:00 PM, "Santoshakhilesh" <sa...@huawei.com>
> wrote:
>
> >Hi Shaofeng ,
> >    I have changed the property file to cerate a new htable and
> >everything was fresh.
> >    Unless there is some restriction for join for hierarchical dimension,
> >I think there is a bug in wizard while creating hierarchical dimensions ,
> >it just deletes the join condition on the table on which hierarch
> >dimension is added.
> >
> >    I  have tried to use the JSON editor , when I open it gives a sample
> >json and not the current json of cube. I tried deleting and pasting the
> >JSON which I get from the cube json view but while saving I get the
> >following error.
> >
> >
> >Error Message
> >Overwriting conflict /cube_desc/HierarchyCube.json, expect old TS
> >1425496007007, but it is 0
> >
> >I had tried deleting the existing cube and then creating using json
> >editor but same error. I also tried removing the UUID part of the json as
> >I thought this you might be generating internally but still the error.
> >
> >So what options I am left with , install 0.7 and give a try ?
> >
> >I had modified the JSON as below to add the join condition. I also have
> >another query why the datatype in json is always null ?
> >
> >{
> >  "uuid": "3721b133-f11a-4ffa-98af-8d8688db3706",
> >  "name": "HierarchyCube",
> >  "description": "",
> >  "dimensions": [
> >    {
> >      "id": 1,
> >      "name": "AREA",
> >      "join": {
> >        "type": "left",
> >        "primary_key": [
> >          "STOREID"
> >        ],
> >        "foreign_key": [
> >          "STOREID"
> >        ]
> >      },
> >      "hierarchy": [
> >        {
> >          "level": "1",
> >          "column": "STATE"
> >        },
> >        {
> >          "level": "2",
> >          "column": "CITY"
> >        }
> >      ],
> >      "table": "STORE_DIM",
> >      "column": "{FK}",
> >      "datatype": null,
> >      "derived": null
> >    },
> >    {
> >      "id": 2,
> >      "name": "SALES_FACT.STOREID",
> >      "join": null,
> >      "hierarchy": null,
> >      "table": "SALES_FACT",
> >      "column": "STOREID",
> >      "datatype": null,
> >      "derived": null
> >    },
> >    {
> >      "id": 3,
> >      "name": "SALES_FACT.ITEMID",
> >      "join": null,
> >      "hierarchy": null,
> >      "table": "SALES_FACT",
> >      "column": "ITEMID",
> >      "datatype": null,
> >      "derived": null
> >    },
> >    {
> >      "id": 4,
> >      "name": "SALES_FACT.CUSTOMERID",
> >      "join": null,
> >      "hierarchy": null,
> >      "table": "SALES_FACT",
> >      "column": "CUSTOMERID",
> >      "datatype": null,
> >      "derived": null
> >    },
> >    {
> >      "id": 5,
> >      "name": "CUSTOMER_DIM_DERIVED",
> >      "join": {
> >        "type": "left",
> >        "primary_key": [
> >          "CUSTOMERID"
> >        ],
> >        "foreign_key": [
> >          "CUSTOMERID"
> >        ]
> >      },
> >      "hierarchy": null,
> >      "table": "CUSTOMER_DIM",
> >      "column": "{FK}",
> >      "datatype": null,
> >      "derived": [
> >        "NAME"
> >      ]
> >    },
> >    {
> >      "id": 6,
> >      "name": "ITEM_DIM_DERIVED",
> >      "join": {
> >        "type": "left",
> >        "primary_key": [
> >          "ITEMID"
> >        ],
> >        "foreign_key": [
> >          "ITEMID"
> >        ]
> >      },
> >      "hierarchy": null,
> >      "table": "ITEM_DIM",
> >      "column": "{FK}",
> >      "datatype": null,
> >      "derived": [
> >        "BRAND",
> >        "COLOR"
> >      ]
> >    }
> >  ],
> >  "measures": [
> >    {
> >      "id": 1,
> >      "name": "_COUNT_",
> >      "function": {
> >        "expression": "COUNT",
> >        "parameter": {
> >          "type": "constant",
> >          "value": "1"
> >        },
> >        "returntype": "bigint"
> >      },
> >      "dependent_measure_ref": null
> >    },
> >    {
> >      "id": 2,
> >      "name": "TOTALAMOUNT",
> >      "function": {
> >        "expression": "SUM",
> >        "parameter": {
> >          "type": "column",
> >          "value": "AMOUNT"
> >        },
> >        "returntype": "double"
> >      },
> >      "dependent_measure_ref": null
> >    },
> >    {
> >      "id": 3,
> >      "name": "TOTALQTY",
> >      "function": {
> >        "expression": "SUM",
> >        "parameter": {
> >          "type": "column",
> >          "value": "QTY"
> >        },
> >        "returntype": "int"
> >      },
> >      "dependent_measure_ref": null
> >    }
> >  ],
> >  "rowkey": {
> >    "rowkey_columns": [
> >      {
> >        "column": "STATE",
> >        "length": 0,
> >        "dictionary": "true",
> >        "mandatory": false
> >      },
> >      {
> >        "column": "CITY",
> >        "length": 0,
> >        "dictionary": "true",
> >        "mandatory": false
> >      },
> >      {
> >        "column": "STOREID",
> >        "length": 0,
> >        "dictionary": "true",
> >        "mandatory": false
> >      },
> >      {
> >        "column": "ITEMID",
> >        "length": 0,
> >        "dictionary": "true",
> >        "mandatory": false
> >      },
> >      {
> >        "column": "CUSTOMERID",
> >        "length": 0,
> >        "dictionary": "true",
> >        "mandatory": false
> >      }
> >    ],
> >    "aggregation_groups": [
> >      [
> >        "STOREID",
> >        "ITEMID",
> >        "CUSTOMERID"
> >      ],
> >      [
> >        "STATE",
> >        "CITY"
> >      ]
> >    ]
> >  },
> >  "signature": "OiAo60gPr38KVois4jHGKw==",
> >  "capacity": "MEDIUM",
> >  "last_modified": 1425496007007,
> >  "fact_table": "SALES_FACT",
> >  "null_string": null,
> >  "filter_condition": null,
> >  "cube_partition_desc": {
> >    "partition_date_column": null,
> >    "partition_date_start": 0,
> >    "cube_partition_type": "APPEND"
> >  },
> >  "hbase_mapping": {
> >    "column_family": [
> >      {
> >        "name": "F1",
> >        "columns": [
> >          {
> >            "qualifier": "M",
> >            "measure_refs": [
> >              "_COUNT_",
> >              "TOTALAMOUNT",
> >              "TOTALQTY"
> >            ]
> >          }
> >        ]
> >      }
> >    ]
> >  },
> >  "notify_list": []
> >}
> >
> >
> >Regards,
> >Santosh Akhilesh
> >Bangalore R&D
> >HUAWEI TECHNOLOGIES CO.,LTD.
> >
> >www.huawei.com
> >
> >________________________________________
> >From: Shi, Shaofeng [shaoshi@ebay.com]
> >Sent: Wednesday, March 04, 2015 6:40 PM
> >To: dev@kylin.incubator.apache.org
> >Cc: Kulbhushan Rana
> >Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
> >dimension
> >
> >Hi Santosh, 0.6.5 should be stable, but there is an obvious error in this
> >cube JSON: the dimension 1 is from lookup table “STORE_DIM”, in this case
> >the “join” must be specified, while now it is null; This would cause Kylin
> >fail to join fact table with the lookup table. Please try to edit the cube
> >to add the join condition, let it looks like other dimensions; If the
> >wizard couldn’t work, try to manually add join and then use the “JSON
> >Editor” function to update the cube;
> >
> >From now on we would suggest user to use 0.7.1; The binary package will be
> >much easier to install; and there are many enhancements and bug fixes in
> >0.7.1;
> >
> >To get a fresh metadata store, just use a different htable, this is
> >configurable in kylin.properties:
> >
> >kylin.metadata.url=kylin_metadata_qa@hbase
> >
> >The default name is kylin_metadata_qa, if change a name, Kylin will get a
> >fresh metadata store;
> >
> >
> >On 3/4/15, 8:52 PM, "Santoshakhilesh" <sa...@huawei.com>
> wrote:
> >
> >>Hi Shaofeng ,
> >>
> >>      I had deleted the cube and tried to build again for hierarchy, now
> >>it fails in first step itself.
> >>      I ahve three dimension tables
> >>      a) customer_dim b_ store_dim c) item_dim
> >>      I choose left join with fact table while cerating dimensions , but
> >>after cube creation left join on sotre_dim is automatically deleted by
> >>kylin
> >>      store dimension has fields stoireid , city , state , I had tried to
> >>add the hierarchy dimension (1) State and (2) City.
> >>
> >>      I have started facing many issues , even with normal dimensions the
> >>kylin query result not matching with hive query.
> >>      I had also tried to delete all the metadata in hive and hbase (i
> >>had deleted all the entries) and started from beginning by creating a new
> >>project but now problem is persisting.
> >>
> >>      Do you suggest me to install binary distribution is it stable
> >>enough now ? If yes how do I make sure all the previous data is deleted.
> >>deleting hive and hbase data is enough or I should delete something else
> >>too ?
> >>
> >>      Logs are as below. Sorry for long mail.
> >>
> >>
> >>
> >>
> >>
> >>JSON:
> >>{
> >>  "uuid": "5a5adb86-202a-4e2b-9be7-de421d3bdf2f",
> >>  "name": "Hierarchy",
> >>  "description": "",
> >>  "dimensions": [
> >>    {
> >>      "id": 1,
> >>      "name": "AREA",
> >>      "join": null,
> >>      "hierarchy": [
> >>        {
> >>          "level": "1",
> >>          "column": "STATE"
> >>        },
> >>        {
> >>          "level": "2",
> >>          "column": "CITY"
> >>        }
> >>      ],
> >>      "table": "STORE_DIM",
> >>      "column": null,
> >>      "datatype": null,
> >>      "derived": null
> >>    },
> >>    {
> >>      "id": 2,
> >>      "name": "CUSTOMER_DIM_DERIVED",
> >>      "join": {
> >>        "type": "left",
> >>        "primary_key": [
> >>          "CUSTOMERID"
> >>        ],
> >>        "foreign_key": [
> >>          "CUSTOMERID"
> >>        ]
> >>      },
> >>      "hierarchy": null,
> >>      "table": "CUSTOMER_DIM",
> >>      "column": "{FK}",
> >>      "datatype": null,
> >>      "derived": [
> >>        "NAME"
> >>      ]
> >>    },
> >>    {
> >>      "id": 3,
> >>      "name": "ITEM_DIM_DERIVED",
> >>      "join": {
> >>        "type": "left",
> >>        "primary_key": [
> >>          "ITEMID"
> >>        ],
> >>        "foreign_key": [
> >>          "ITEMID"
> >>        ]
> >>      },
> >>      "hierarchy": null,
> >>      "table": "ITEM_DIM",
> >>      "column": "{FK}",
> >>      "datatype": null,
> >>      "derived": [
> >>        "TYPE",
> >>        "BRAND",
> >>        "COLOR"
> >>      ]
> >>    }
> >>  ],
> >>  "measures": [
> >>    {
> >>      "id": 1,
> >>      "name": "_COUNT_",
> >>      "function": {
> >>        "expression": "COUNT",
> >>        "parameter": {
> >>          "type": "constant",
> >>          "value": "1"
> >>        },
> >>        "returntype": "bigint"
> >>      },
> >>      "dependent_measure_ref": null
> >>    },
> >>    {
> >>      "id": 2,
> >>      "name": "TOTALAMOUNT",
> >>      "function": {
> >>        "expression": "SUM",
> >>        "parameter": {
> >>          "type": "column",
> >>          "value": "AMOUNT"
> >>        },
> >>        "returntype": "double"
> >>      },
> >>      "dependent_measure_ref": null
> >>    },
> >>    {
> >>      "id": 3,
> >>      "name": "TOTALQTY",
> >>      "function": {
> >>        "expression": "SUM",
> >>        "parameter": {
> >>          "type": "column",
> >>          "value": "QTY"
> >>        },
> >>        "returntype": "int"
> >>      },
> >>      "dependent_measure_ref": null
> >>    }
> >>  ],
> >>  "rowkey": {
> >>    "rowkey_columns": [
> >>      {
> >>        "column": "STATE",
> >>        "length": 0,
> >>        "dictionary": "true",
> >>        "mandatory": false
> >>      },
> >>      {
> >>        "column": "CITY",
> >>        "length": 0,
> >>        "dictionary": "true",
> >>        "mandatory": false
> >>      },
> >>      {
> >>        "column": "CUSTOMERID",
> >>        "length": 0,
> >>        "dictionary": "true",
> >>        "mandatory": false
> >>      },
> >>      {
> >>        "column": "ITEMID",
> >>        "length": 0,
> >>        "dictionary": "true",
> >>        "mandatory": false
> >>      }
> >>    ],
> >>    "aggregation_groups": [
> >>      [
> >>        "CUSTOMERID",
> >>        "ITEMID"
> >>      ],
> >>      [
> >>        "STATE",
> >>        "CITY"
> >>      ]
> >>    ]
> >>  },
> >>  "signature": "X6NQ6wZ9ZgvhBLqw0YAKhQ==",
> >>  "capacity": "MEDIUM",
> >>  "last_modified": 1425492494239,
> >>  "fact_table": "SALES_FACT",
> >>  "null_string": null,
> >>  "filter_condition": null,
> >>  "cube_partition_desc": {
> >>    "partition_date_column": null,
> >>    "partition_date_start": 0,
> >>    "cube_partition_type": "APPEND"
> >>  },
> >>  "hbase_mapping": {
> >>    "column_family": [
> >>      {
> >>        "name": "F1",
> >>        "columns": [
> >>          {
> >>            "qualifier": "M",
> >>            "measure_refs": [
> >>              "_COUNT_",
> >>              "TOTALAMOUNT",
> >>              "TOTALQTY"
> >>            ]
> >>          }
> >>        ]
> >>      }
> >>    ]
> >>  },
> >>  "notify_list": []
> >>}
> >>
> >>Logs:
> >>
> >>15/03/05 02:09:18 WARN conf.HiveConf: DEPRECATED: Configuration property
> >>hive.metastore.local no longer has any effect. Make sure to provide a
> >>valid value for hive.metastore.uris if you are connecting to a remote
> >>metastore.
> >>15/03/05 02:09:18 WARN conf.HiveConf: HiveConf of name
> >>hive.metastore.local does not exist
> >>Logging initialized using configuration in
> >>jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-
> common-0.14.0.jar!/hiv
> >>e
> >>-log4j.properties
> >>SLF4J: Class path contains multiple SLF4J bindings.
> >>SLF4J: Found binding in
> >>[jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/
> common/lib/slf4j-log4j12-
> >>1
> >>.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>SLF4J: Found binding in
> >>[jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-jdbc-
> 0.14.0-standalon
> >>e
> >>.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >>explanation.
> >>SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> >>OK
> >>Time taken: 0.578 seconds
> >>OK
> >>Time taken: 0.444 seconds
> >>FAILED: SemanticException [Error 10004]: Line 4:0 Invalid table alias or
> >>column reference 'STORE_DIM': (possible column names are:
> >>sales_fact.storeid, sales_fact.itemid, sales_fact.customerid,
> >>sales_fact.qty, sales_fact.amount, customer_dim.customerid,
> >>customer_dim.name, item_dim.itemid, item_dim.type, item_dim.brand,
> >>item_dim.color)
> >>
> >>
> >>
> >>Regards,
> >>Santosh Akhilesh
> >>Bangalore R&D
> >>HUAWEI TECHNOLOGIES CO.,LTD.
> >>
> >>www.huawei.com
> >>
> >>________________________________________
> >>From: Shi, Shaofeng [shaoshi@ebay.com]
> >>Sent: Wednesday, March 04, 2015 5:36 PM
> >>To: dev@kylin.incubator.apache.org
> >>Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
> >>dimension
> >>
> >>It seems that you have a lookup table which doesn¹t define the join
> >>relationship; Could you paste the full json of this cube definition?
> >>
> >>On 3/4/15, 3:28 PM, "Santoshakhilesh" <sa...@huawei.com>
> >>wrote:
> >>
> >>>Dear All ,
> >>>
> >>>         I am using 0.6.5 branch of Kylin. I was able to build a cube
> >>>defining normal and derived measures and play with it.
> >>>
> >>>         I have defined a new cube to test hierarchial dimensions and
> >>>cube build is failed at Step 3 with following log in kylin.log
> >>>
> >>>         I have run the query which kylin provides on webui of cube on
> >>>hive and it works.
> >>>
> >>>         Please let me know whats going wrong ? Any more info required
> >>>from me please let me know.
> >>>
> >>>
> >>>
> >>>java.lang.NullPointerException
> >>> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
> >>>
> >>>
> >>>
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,025][INFO][com.kylinolap.dict.lookup.
> SnapshotManager.load(Snaps
> >>>h
> >>>o
> >>>tManager.java:156)] - Loading snapshotTable from
> >>>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-
> 7e561f19283a.snapshot
> >>>,
> >>>with loadData: false
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,031][INFO][com.kylinolap.dict.lookup.
> SnapshotManager.buildSnaps
> >>>h
> >>>o
> >>>t(SnapshotManager.java:90)] - Identical input FileSignature
> >>>[path=file:/hive/warehouse/store_dim/stores.txt, size=60,
> >>>lastModifiedTime=1425039202000], reuse existing snapshot at
> >>>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-
> 7e561f19283a.snapshot
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,031][DEBUG][com.kylinolap.common.persistence.
> ResourceStore.putR
> >>>e
> >>>s
> >>>ource(ResourceStore.java:166)] - Saving resource /cube/NDim.json (Store
> >>>kylin_metadata_qa@hbase<ma...@hbase>)
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,035][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllSou
> >>>r
> >>>c
> >>>eTable(MetadataManager.java:258)] - Reloading SourceTable from folder
> >>>kylin_metadata_qa(key='/table')@kylin_metadata_qa@hbase
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,074][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllSou
> >>>r
> >>>c
> >>>eTable(MetadataManager.java:267)] - Loaded 4 SourceTable(s)
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,074][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllSou
> >>>r
> >>>c
> >>>eTableExd(MetadataManager.java:243)] - Reloading SourceTable exd info
> >>>from folder kylin_metadata_qa(key='/table_exd')@kylin_metadata_qa@hbase
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,104][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllSou
> >>>r
> >>>c
> >>>eTableExd(MetadataManager.java:253)] - Loaded 4 SourceTable EXD(s)
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,104][INFO][com.kylinolap.metadata.
> MetadataManager.reloadAllCube
> >>>D
> >>>e
> >>>sc(MetadataManager.java:308)] - Reloading Cube Metadata from folder
> >>>kylin_metadata_qa(key='/cube_desc')@kylin_metadata_qa@hbase
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,143][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllCub
> >>>e
> >>>D
> >>>esc(MetadataManager.java:333)] - Loaded 4 Cube(s)
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,143][INFO][com.kylinolap.metadata.
> MetadataManager.reloadAllInve
> >>>r
> >>>t
> >>>edIndexDesc(MetadataManager.java:356)] - Reloading Inverted Index Desc
> >>>from folder
> >>>kylin_metadata_qa(key='/invertedindex_desc')@kylin_metadata_qa@hbase
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,147][DEBUG][com.kylinolap.metadata.
> MetadataManager.reloadAllInv
> >>>e
> >>>r
> >>>tedIndexDesc(MetadataManager.java:381)] - Loaded 0 Inverted Index
> >>>Desc(s)
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,158][INFO][com.kylinolap.cube.cli.
> DictionaryGeneratorCLI.proces
> >>>s
> >>>S
> >>>egment(DictionaryGeneratorCLI.java:59)] - Checking snapshot of
> STORE_DIM
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,159][ERROR][com.kylinolap.job.hadoop.dict.
> CreateDictionaryJob.r
> >>>u
> >>>n
> >>>(CreateDictionaryJob.java:55)] -
> >>>java.lang.NullPointerException
> >>> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
> >>> at
> >>>com.kylinolap.cube.cli.DictionaryGeneratorCLI.
> processSegment(DictionaryG
> >>>e
> >>>n
> >>>eratorCLI.java:60)
> >>> at
> >>>com.kylinolap.cube.cli.DictionaryGeneratorCLI.
> processSegment(DictionaryG
> >>>e
> >>>n
> >>>eratorCLI.java:39)
> >>> at
> >>>com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(
> CreateDictionaryJo
> >>>b
> >>>.
> >>>java:51)
> >>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> >>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> >>> at com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:54)
> >>> at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
> >>> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
> >>> at
> >>>org.quartz.simpl.SimpleThreadPool$WorkerThread.
> run(SimpleThreadPool.java
> >>>:
> >>>5
> >>>73)
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,159][DEBUG][com.kylinolap.job.cmd.
> JavaHadoopCmdOutput.appendOut
> >>>p
> >>>u
> >>>t(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,166][DEBUG][com.kylinolap.common.persistence.
> ResourceStore.putR
> >>>e
> >>>s
> >>>ource(ResourceStore.java:166)] - Saving resource
> >>>/job_output/70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 (Store
> >>>kylin_metadata_qa@hbase<ma...@hbase>)
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,174][DEBUG][com.kylinolap.common.persistence.
> ResourceStore.putR
> >>>e
> >>>s
> >>>ource(ResourceStore.java:166)] - Saving resource
> >>>/job/70f7dfe5-f414-4643-a014-3ba5c5d3ab22 (Store
> >>>kylin_metadata_qa@hbase<ma...@hbase>)
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,177][INFO][com.kylinolap.job.flow.
> JobFlowNode.execute(JobFlowNo
> >>>d
> >>>e
> >>>.java:87)] - Job status for
> >>>cube_job_group.NDim.70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 has been
> >>>updated.
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,177][INFO][com.kylinolap.job.flow.
> JobFlowNode.execute(JobFlowNo
> >>>d
> >>>e
> >>>.java:88)] - cmd: -cubename NDim -segmentname FULL_BUILD -input
> >>>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/
> fact_distinct_colum
> >>>n
> >>>s
> >>>[QuartzScheduler_Worker-10]:[2015-03-04
> >>>19:00:49,177][INFO][com.kylinolap.job.flow.
> JobFlowNode.execute(JobFlowNo
> >>>d
> >>>e
> >>>.java:89)] - output:Start to execute command:
> >>> -cubename NDim -segmentname FULL_BUILD -input
> >>>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/
> fact_distinct_colum
> >>>n
> >>>s
> >>>Command execute return code 2
> >>>
> >>>
> >>>
> >>>Regards,
> >>>Santosh Akhilesh
> >>>Bangalore R&D
> >>>HUAWEI TECHNOLOGIES CO.,LTD.
> >>>
> >>>www.huawei.com
>
>

Re: Cube Build failed at Step 3 , When I choose Hierarchial dimension

Posted by "Shi, Shaofeng" <sh...@ebay.com>.
There are two entries for the JSON editor: one is at the upper right
corner of the page, which is for creating a new cube with the JSON editor;
the other one is in the Actions drop-down list for a cube (in the “admins”
column, only open to the admin role), which is for the user to edit an existing
cube; what I mentioned is the second entry.

If you create a new cube, the “last_modified” value needs to be 0.

The uuid isn’t in use for now, I think; the key for a cube is the name.
Please ensure each cube has a different name and uuid.

We suggest users upgrade to 0.7.1 now, so that we are at the same code
level; that would be helpful for troubleshooting.
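
To make that concrete, here is a minimal sketch of the header fields a brand-new cube
description submitted through the create-cube JSON editor should carry (the name and
uuid below are made-up placeholders; the point is simply that "last_modified" starts at
0 and each cube gets its own unique name and uuid):

{
  "uuid": "11111111-2222-3333-4444-555555555555",
  "name": "MyNewCube",
  "last_modified": 0
}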

On 3/4/15, 10:00 PM, "Santoshakhilesh" <sa...@huawei.com> wrote:

>Hi Shaofeng,
>    I have changed the property file to create a new htable, so
>everything was fresh.
>    Unless there is some restriction on joins for hierarchical dimensions,
>I think there is a bug in the wizard when creating hierarchical dimensions:
>it just deletes the join condition on the table on which the hierarchy
>dimension is added.
>
>    I have tried to use the JSON editor; when I open it, it gives a sample
>json and not the current json of the cube. I tried deleting that and pasting the
>JSON which I get from the cube json view, but while saving I get the
>following error.
>
>
>Error Message
>Overwriting conflict /cube_desc/HierarchyCube.json, expect old TS
>1425496007007, but it is 0
>
>I had tried deleting the existing cube and then creating it using the json
>editor, but I get the same error. I also tried removing the UUID part of the json, as
>I thought you might be generating this internally, but the error remains.
>
>So what options am I left with; should I install 0.7 and give it a try?
>
>I have modified the JSON as below to add the join condition. I also have
>another question: why is the datatype in the json always null?
>
>{
>  "uuid": "3721b133-f11a-4ffa-98af-8d8688db3706",
>  "name": "HierarchyCube",
>  "description": "",
>  "dimensions": [
>    {
>      "id": 1,
>      "name": "AREA",
>      "join": {
>        "type": "left",
>        "primary_key": [
>          "STOREID"
>        ],
>        "foreign_key": [
>          "STOREID"
>        ]
>      },
>      "hierarchy": [
>        {
>          "level": "1",
>          "column": "STATE"
>        },
>        {
>          "level": "2",
>          "column": "CITY"
>        }
>      ],
>      "table": "STORE_DIM",
>      "column": "{FK}",
>      "datatype": null,
>      "derived": null
>    },
>    {
>      "id": 2,
>      "name": "SALES_FACT.STOREID",
>      "join": null,
>      "hierarchy": null,
>      "table": "SALES_FACT",
>      "column": "STOREID",
>      "datatype": null,
>      "derived": null
>    },
>    {
>      "id": 3,
>      "name": "SALES_FACT.ITEMID",
>      "join": null,
>      "hierarchy": null,
>      "table": "SALES_FACT",
>      "column": "ITEMID",
>      "datatype": null,
>      "derived": null
>    },
>    {
>      "id": 4,
>      "name": "SALES_FACT.CUSTOMERID",
>      "join": null,
>      "hierarchy": null,
>      "table": "SALES_FACT",
>      "column": "CUSTOMERID",
>      "datatype": null,
>      "derived": null
>    },
>    {
>      "id": 5,
>      "name": "CUSTOMER_DIM_DERIVED",
>      "join": {
>        "type": "left",
>        "primary_key": [
>          "CUSTOMERID"
>        ],
>        "foreign_key": [
>          "CUSTOMERID"
>        ]
>      },
>      "hierarchy": null,
>      "table": "CUSTOMER_DIM",
>      "column": "{FK}",
>      "datatype": null,
>      "derived": [
>        "NAME"
>      ]
>    },
>    {
>      "id": 6,
>      "name": "ITEM_DIM_DERIVED",
>      "join": {
>        "type": "left",
>        "primary_key": [
>          "ITEMID"
>        ],
>        "foreign_key": [
>          "ITEMID"
>        ]
>      },
>      "hierarchy": null,
>      "table": "ITEM_DIM",
>      "column": "{FK}",
>      "datatype": null,
>      "derived": [
>        "BRAND",
>        "COLOR"
>      ]
>    }
>  ],
>  "measures": [
>    {
>      "id": 1,
>      "name": "_COUNT_",
>      "function": {
>        "expression": "COUNT",
>        "parameter": {
>          "type": "constant",
>          "value": "1"
>        },
>        "returntype": "bigint"
>      },
>      "dependent_measure_ref": null
>    },
>    {
>      "id": 2,
>      "name": "TOTALAMOUNT",
>      "function": {
>        "expression": "SUM",
>        "parameter": {
>          "type": "column",
>          "value": "AMOUNT"
>        },
>        "returntype": "double"
>      },
>      "dependent_measure_ref": null
>    },
>    {
>      "id": 3,
>      "name": "TOTALQTY",
>      "function": {
>        "expression": "SUM",
>        "parameter": {
>          "type": "column",
>          "value": "QTY"
>        },
>        "returntype": "int"
>      },
>      "dependent_measure_ref": null
>    }
>  ],
>  "rowkey": {
>    "rowkey_columns": [
>      {
>        "column": "STATE",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "CITY",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "STOREID",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "ITEMID",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "CUSTOMERID",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      }
>    ],
>    "aggregation_groups": [
>      [
>        "STOREID",
>        "ITEMID",
>        "CUSTOMERID"
>      ],
>      [
>        "STATE",
>        "CITY"
>      ]
>    ]
>  },
>  "signature": "OiAo60gPr38KVois4jHGKw==",
>  "capacity": "MEDIUM",
>  "last_modified": 1425496007007,
>  "fact_table": "SALES_FACT",
>  "null_string": null,
>  "filter_condition": null,
>  "cube_partition_desc": {
>    "partition_date_column": null,
>    "partition_date_start": 0,
>    "cube_partition_type": "APPEND"
>  },
>  "hbase_mapping": {
>    "column_family": [
>      {
>        "name": "F1",
>        "columns": [
>          {
>            "qualifier": "M",
>            "measure_refs": [
>              "_COUNT_",
>              "TOTALAMOUNT",
>              "TOTALQTY"
>            ]
>          }
>        ]
>      }
>    ]
>  },
>  "notify_list": []
>}
>
>
>Regards,
>Santosh Akhilesh
>Bangalore R&D
>HUAWEI TECHNOLOGIES CO.,LTD.
>
>www.huawei.com
>
>________________________________________
>From: Shi, Shaofeng [shaoshi@ebay.com]
>Sent: Wednesday, March 04, 2015 6:40 PM
>To: dev@kylin.incubator.apache.org
>Cc: Kulbhushan Rana
>Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
>dimension
>
>Hi Santosh, 0.6.5 should be stable, but there is an obvious error in this
>cube JSON: the dimension 1 is from lookup table “STORE_DIM”, in this case
>the “join” must be specified, while now it is null; This would cause Kylin
>fail to join fact table with the lookup table. Please try to edit the cube
>to add the join condition, let it looks like other dimensions; If the
>wizard couldn’t work, try to manually add join and then use the “JSON
>Editor” function to update the cube;
>
>From now on we would suggest user to use 0.7.1; The binary package will be
>much easier to install; and there are many enhancements and bug fixes in
>0.7.1;
>
>To get a fresh metadata store, just use a different htable, this is
>configurable in kylin.properties:
>
>kylin.metadata.url=kylin_metadata_qa@hbase
>
>The default name is kylin_metadata_qa, if change a name, Kylin will get a
>fresh metadata store;
>
>
>On 3/4/15, 8:52 PM, "Santoshakhilesh" <sa...@huawei.com> wrote:
>
>>Hi Shaofeng ,
>>
>>      I had deleted the cube and tried to build again for hierarchy, now
>>it fails in first step itself.
>>      I ahve three dimension tables
>>      a) customer_dim b_ store_dim c) item_dim
>>      I choose left join with fact table while cerating dimensions , but
>>after cube creation left join on sotre_dim is automatically deleted by
>>kylin
>>      store dimension has fields stoireid , city , state , I had tried to
>>add the hierarchy dimension (1) State and (2) City.
>>
>>      I have started facing many issues , even with normal dimensions the
>>kylin query result not matching with hive query.
>>      I had also tried to delete all the metadata in hive and hbase (i
>>had deleted all the entries) and started from beginning by creating a new
>>project but now problem is persisting.
>>
>>      Do you suggest me to install binary distribution is it stable
>>enough now ? If yes how do I make sure all the previous data is deleted.
>>deleting hive and hbase data is enough or I should delete something else
>>too ?
>>
>>      Logs are as below. Sorry for long mail.
>>
>>
>>
>>
>>
>>JSON:
>>{
>>  "uuid": "5a5adb86-202a-4e2b-9be7-de421d3bdf2f",
>>  "name": "Hierarchy",
>>  "description": "",
>>  "dimensions": [
>>    {
>>      "id": 1,
>>      "name": "AREA",
>>      "join": null,
>>      "hierarchy": [
>>        {
>>          "level": "1",
>>          "column": "STATE"
>>        },
>>        {
>>          "level": "2",
>>          "column": "CITY"
>>        }
>>      ],
>>      "table": "STORE_DIM",
>>      "column": null,
>>      "datatype": null,
>>      "derived": null
>>    },
>>    {
>>      "id": 2,
>>      "name": "CUSTOMER_DIM_DERIVED",
>>      "join": {
>>        "type": "left",
>>        "primary_key": [
>>          "CUSTOMERID"
>>        ],
>>        "foreign_key": [
>>          "CUSTOMERID"
>>        ]
>>      },
>>      "hierarchy": null,
>>      "table": "CUSTOMER_DIM",
>>      "column": "{FK}",
>>      "datatype": null,
>>      "derived": [
>>        "NAME"
>>      ]
>>    },
>>    {
>>      "id": 3,
>>      "name": "ITEM_DIM_DERIVED",
>>      "join": {
>>        "type": "left",
>>        "primary_key": [
>>          "ITEMID"
>>        ],
>>        "foreign_key": [
>>          "ITEMID"
>>        ]
>>      },
>>      "hierarchy": null,
>>      "table": "ITEM_DIM",
>>      "column": "{FK}",
>>      "datatype": null,
>>      "derived": [
>>        "TYPE",
>>        "BRAND",
>>        "COLOR"
>>      ]
>>    }
>>  ],
>>  "measures": [
>>    {
>>      "id": 1,
>>      "name": "_COUNT_",
>>      "function": {
>>        "expression": "COUNT",
>>        "parameter": {
>>          "type": "constant",
>>          "value": "1"
>>        },
>>        "returntype": "bigint"
>>      },
>>      "dependent_measure_ref": null
>>    },
>>    {
>>      "id": 2,
>>      "name": "TOTALAMOUNT",
>>      "function": {
>>        "expression": "SUM",
>>        "parameter": {
>>          "type": "column",
>>          "value": "AMOUNT"
>>        },
>>        "returntype": "double"
>>      },
>>      "dependent_measure_ref": null
>>    },
>>    {
>>      "id": 3,
>>      "name": "TOTALQTY",
>>      "function": {
>>        "expression": "SUM",
>>        "parameter": {
>>          "type": "column",
>>          "value": "QTY"
>>        },
>>        "returntype": "int"
>>      },
>>      "dependent_measure_ref": null
>>    }
>>  ],
>>  "rowkey": {
>>    "rowkey_columns": [
>>      {
>>        "column": "STATE",
>>        "length": 0,
>>        "dictionary": "true",
>>        "mandatory": false
>>      },
>>      {
>>        "column": "CITY",
>>        "length": 0,
>>        "dictionary": "true",
>>        "mandatory": false
>>      },
>>      {
>>        "column": "CUSTOMERID",
>>        "length": 0,
>>        "dictionary": "true",
>>        "mandatory": false
>>      },
>>      {
>>        "column": "ITEMID",
>>        "length": 0,
>>        "dictionary": "true",
>>        "mandatory": false
>>      }
>>    ],
>>    "aggregation_groups": [
>>      [
>>        "CUSTOMERID",
>>        "ITEMID"
>>      ],
>>      [
>>        "STATE",
>>        "CITY"
>>      ]
>>    ]
>>  },
>>  "signature": "X6NQ6wZ9ZgvhBLqw0YAKhQ==",
>>  "capacity": "MEDIUM",
>>  "last_modified": 1425492494239,
>>  "fact_table": "SALES_FACT",
>>  "null_string": null,
>>  "filter_condition": null,
>>  "cube_partition_desc": {
>>    "partition_date_column": null,
>>    "partition_date_start": 0,
>>    "cube_partition_type": "APPEND"
>>  },
>>  "hbase_mapping": {
>>    "column_family": [
>>      {
>>        "name": "F1",
>>        "columns": [
>>          {
>>            "qualifier": "M",
>>            "measure_refs": [
>>              "_COUNT_",
>>              "TOTALAMOUNT",
>>              "TOTALQTY"
>>            ]
>>          }
>>        ]
>>      }
>>    ]
>>  },
>>  "notify_list": []
>>}
>>
>>Logs:
>>
>>15/03/05 02:09:18 WARN conf.HiveConf: DEPRECATED: Configuration property
>>hive.metastore.local no longer has any effect. Make sure to provide a
>>valid value for hive.metastore.uris if you are connecting to a remote
>>metastore.
>>15/03/05 02:09:18 WARN conf.HiveConf: HiveConf of name
>>hive.metastore.local does not exist
>>Logging initialized using configuration in
>>jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-common-0.14.0.jar!/hiv
>>e
>>-log4j.properties
>>SLF4J: Class path contains multiple SLF4J bindings.
>>SLF4J: Found binding in
>>[jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-
>>1
>>.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>SLF4J: Found binding in
>>[jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-jdbc-0.14.0-standalon
>>e
>>.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>explanation.
>>SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>OK
>>Time taken: 0.578 seconds
>>OK
>>Time taken: 0.444 seconds
>>FAILED: SemanticException [Error 10004]: Line 4:0 Invalid table alias or
>>column reference 'STORE_DIM': (possible column names are:
>>sales_fact.storeid, sales_fact.itemid, sales_fact.customerid,
>>sales_fact.qty, sales_fact.amount, customer_dim.customerid,
>>customer_dim.name, item_dim.itemid, item_dim.type, item_dim.brand,
>>item_dim.color)
>>
>>
>>
>>Regards,
>>Santosh Akhilesh
>>Bangalore R&D
>>HUAWEI TECHNOLOGIES CO.,LTD.
>>
>>www.huawei.com
>>-------------------------------------------------------------------------
>>-
>>-----------------------------------------------------------
>>This e-mail and its attachments contain confidential information from
>>HUAWEI, which
>>is intended only for the person or entity whose address is listed above.
>>Any use of the
>>information contained herein in any way (including, but not limited to,
>>total or partial
>>disclosure, reproduction, or dissemination) by persons other than the
>>intended
>>recipient(s) is prohibited. If you receive this e-mail in error, please
>>notify the sender by
>>phone or email immediately and delete it!
>>
>>________________________________________
>>From: Shi, Shaofeng [shaoshi@ebay.com]
>>Sent: Wednesday, March 04, 2015 5:36 PM
>>To: dev@kylin.incubator.apache.org
>>Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
>>dimension
>>
>It seems that you have a lookup table which doesn’t define the join
>>relationship; Could you paste the full json of this cube definition?
>>
>>On 3/4/15, 3:28 PM, "Santoshakhilesh" <sa...@huawei.com>
>>wrote:
>>
>>>Dear All ,
>>>
>>>         I am using 0.6.5 branch of Kylin. I was able to build a cube
>>>defining normal and derived measures and play with it.
>>>
>>>         I have defined a new cube to test hierarchial dimensions and
>>>cube build is failed at Step 3 with following log in kylin.log
>>>
>>>         I have run the query which kylin provides on webui of cube on
>>>hive and it works.
>>>
>>>         Please let me know whats going wrong ? Any more info required
>>>from me please let me know.
>>>
>>>
>>>
>>>java.lang.NullPointerException
>>> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>>>
>>>
>>>
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,025][INFO][com.kylinolap.dict.lookup.SnapshotManager.load(Snaps
>>>h
>>>o
>>>tManager.java:156)] - Loading snapshotTable from
>>>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot
>>>,
>>>with loadData: false
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,031][INFO][com.kylinolap.dict.lookup.SnapshotManager.buildSnaps
>>>h
>>>o
>>>t(SnapshotManager.java:90)] - Identical input FileSignature
>>>[path=file:/hive/warehouse/store_dim/stores.txt, size=60,
>>>lastModifiedTime=1425039202000], reuse existing snapshot at
>>>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,031][DEBUG][com.kylinolap.common.persistence.ResourceStore.putR
>>>e
>>>s
>>>ource(ResourceStore.java:166)] - Saving resource /cube/NDim.json (Store
>>>kylin_metadata_qa@hbase<ma...@hbase>)
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,035][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSou
>>>r
>>>c
>>>eTable(MetadataManager.java:258)] - Reloading SourceTable from folder
>>>kylin_metadata_qa(key='/table')@kylin_metadata_qa@hbase
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSou
>>>r
>>>c
>>>eTable(MetadataManager.java:267)] - Loaded 4 SourceTable(s)
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSou
>>>r
>>>c
>>>eTableExd(MetadataManager.java:243)] - Reloading SourceTable exd info
>>>from folder kylin_metadata_qa(key='/table_exd')@kylin_metadata_qa@hbase
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,104][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSou
>>>r
>>>c
>>>eTableExd(MetadataManager.java:253)] - Loaded 4 SourceTable EXD(s)
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,104][INFO][com.kylinolap.metadata.MetadataManager.reloadAllCube
>>>D
>>>e
>>>sc(MetadataManager.java:308)] - Reloading Cube Metadata from folder
>>>kylin_metadata_qa(key='/cube_desc')@kylin_metadata_qa@hbase
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,143][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllCub
>>>e
>>>D
>>>esc(MetadataManager.java:333)] - Loaded 4 Cube(s)
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,143][INFO][com.kylinolap.metadata.MetadataManager.reloadAllInve
>>>r
>>>t
>>>edIndexDesc(MetadataManager.java:356)] - Reloading Inverted Index Desc
>>>from folder
>>>kylin_metadata_qa(key='/invertedindex_desc')@kylin_metadata_qa@hbase
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,147][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllInv
>>>e
>>>r
>>>tedIndexDesc(MetadataManager.java:381)] - Loaded 0 Inverted Index
>>>Desc(s)
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,158][INFO][com.kylinolap.cube.cli.DictionaryGeneratorCLI.proces
>>>s
>>>S
>>>egment(DictionaryGeneratorCLI.java:59)] - Checking snapshot of STORE_DIM
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,159][ERROR][com.kylinolap.job.hadoop.dict.CreateDictionaryJob.r
>>>u
>>>n
>>>(CreateDictionaryJob.java:55)] -
>>>java.lang.NullPointerException
>>> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>>> at
>>>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryG
>>>e
>>>n
>>>eratorCLI.java:60)
>>> at
>>>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryG
>>>e
>>>n
>>>eratorCLI.java:39)
>>> at
>>>com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJo
>>>b
>>>.
>>>java:51)
>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>> at com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:54)
>>> at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
>>> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>>> at
>>>org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java
>>>:
>>>5
>>>73)
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,159][DEBUG][com.kylinolap.job.cmd.JavaHadoopCmdOutput.appendOut
>>>p
>>>u
>>>t(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,166][DEBUG][com.kylinolap.common.persistence.ResourceStore.putR
>>>e
>>>s
>>>ource(ResourceStore.java:166)] - Saving resource
>>>/job_output/70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 (Store
>>>kylin_metadata_qa@hbase<ma...@hbase>)
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,174][DEBUG][com.kylinolap.common.persistence.ResourceStore.putR
>>>e
>>>s
>>>ource(ResourceStore.java:166)] - Saving resource
>>>/job/70f7dfe5-f414-4643-a014-3ba5c5d3ab22 (Store
>>>kylin_metadata_qa@hbase<ma...@hbase>)
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNo
>>>d
>>>e
>>>.java:87)] - Job status for
>>>cube_job_group.NDim.70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 has been
>>>updated.
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNo
>>>d
>>>e
>>>.java:88)] - cmd: -cubename NDim -segmentname FULL_BUILD -input
>>>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_colum
>>>n
>>>s
>>>[QuartzScheduler_Worker-10]:[2015-03-04
>>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNo
>>>d
>>>e
>>>.java:89)] - output:Start to execute command:
>>> -cubename NDim -segmentname FULL_BUILD -input
>>>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_colum
>>>n
>>>s
>>>Command execute return code 2
>>>
>>>
>>>
>>>Regards,
>>>Santosh Akhilesh
>>>Bangalore R&D
>>>HUAWEI TECHNOLOGIES CO.,LTD.
>>>
>>>www.huawei.com
>>>------------------------------------------------------------------------
>>>-
>>>-
>>>-----------------------------------------------------------
>>>This e-mail and its attachments contain confidential information from
>>>HUAWEI, which
>>>is intended only for the person or entity whose address is listed above.
>>>Any use of the
>>>information contained herein in any way (including, but not limited to,
>>>total or partial
>>>disclosure, reproduction, or dissemination) by persons other than the
>>>intended
>>>recipient(s) is prohibited. If you receive this e-mail in error, please
>>>notify the sender by
>>>phone or email immediately and delete it!


RE: Cube Build failed at Step 3 , When I choose Hierarchial dimension

Posted by Santoshakhilesh <sa...@huawei.com>.
Hi Shaofeng,
    I have changed the property file to create a new HTable, so everything started fresh.
    Unless there is some restriction on joins for hierarchical dimensions,
I think there is a bug in the wizard when creating hierarchical dimensions: it simply deletes the join condition on the table on which the hierarchy dimension is defined.
    
    I have tried to use the JSON editor. When I open it, it shows a sample JSON rather than the current JSON of the cube. I tried deleting that sample and pasting the JSON I get from the cube's JSON view, but while saving I get the following error.


Error Message
Overwriting conflict /cube_desc/HierarchyCube.json, expect old TS 1425496007007, but it is 0

I had tried deleting the existing cube and then creating it again through the JSON editor, but I get the same error. I also tried removing the UUID part of the JSON, since I thought you might be generating it internally, but the error remains.

So what options am I left with; should I install 0.7 and give it a try?

I have modified the JSON as below to add the join condition. I also have another question: why is the datatype in the JSON always null?

{
  "uuid": "3721b133-f11a-4ffa-98af-8d8688db3706",
  "name": "HierarchyCube",
  "description": "",
  "dimensions": [
    {
      "id": 1,
      "name": "AREA",
      "join": {
        "type": "left",
        "primary_key": [
          "STOREID"
        ],
        "foreign_key": [
          "STOREID"
        ]
      },
      "hierarchy": [
        {
          "level": "1",
          "column": "STATE"
        },
        {
          "level": "2",
          "column": "CITY"
        }
      ],
      "table": "STORE_DIM",
      "column": "{FK}",
      "datatype": null,
      "derived": null
    },
    {
      "id": 2,
      "name": "SALES_FACT.STOREID",
      "join": null,
      "hierarchy": null,
      "table": "SALES_FACT",
      "column": "STOREID",
      "datatype": null,
      "derived": null
    },
    {
      "id": 3,
      "name": "SALES_FACT.ITEMID",
      "join": null,
      "hierarchy": null,
      "table": "SALES_FACT",
      "column": "ITEMID",
      "datatype": null,
      "derived": null
    },
    {
      "id": 4,
      "name": "SALES_FACT.CUSTOMERID",
      "join": null,
      "hierarchy": null,
      "table": "SALES_FACT",
      "column": "CUSTOMERID",
      "datatype": null,
      "derived": null
    },
    {
      "id": 5,
      "name": "CUSTOMER_DIM_DERIVED",
      "join": {
        "type": "left",
        "primary_key": [
          "CUSTOMERID"
        ],
        "foreign_key": [
          "CUSTOMERID"
        ]
      },
      "hierarchy": null,
      "table": "CUSTOMER_DIM",
      "column": "{FK}",
      "datatype": null,
      "derived": [
        "NAME"
      ]
    },
    {
      "id": 6,
      "name": "ITEM_DIM_DERIVED",
      "join": {
        "type": "left",
        "primary_key": [
          "ITEMID"
        ],
        "foreign_key": [
          "ITEMID"
        ]
      },
      "hierarchy": null,
      "table": "ITEM_DIM",
      "column": "{FK}",
      "datatype": null,
      "derived": [
        "BRAND",
        "COLOR"
      ]
    }
  ],
  "measures": [
    {
      "id": 1,
      "name": "_COUNT_",
      "function": {
        "expression": "COUNT",
        "parameter": {
          "type": "constant",
          "value": "1"
        },
        "returntype": "bigint"
      },
      "dependent_measure_ref": null
    },
    {
      "id": 2,
      "name": "TOTALAMOUNT",
      "function": {
        "expression": "SUM",
        "parameter": {
          "type": "column",
          "value": "AMOUNT"
        },
        "returntype": "double"
      },
      "dependent_measure_ref": null
    },
    {
      "id": 3,
      "name": "TOTALQTY",
      "function": {
        "expression": "SUM",
        "parameter": {
          "type": "column",
          "value": "QTY"
        },
        "returntype": "int"
      },
      "dependent_measure_ref": null
    }
  ],
  "rowkey": {
    "rowkey_columns": [
      {
        "column": "STATE",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      },
      {
        "column": "CITY",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      },
      {
        "column": "STOREID",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      },
      {
        "column": "ITEMID",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      },
      {
        "column": "CUSTOMERID",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      }
    ],
    "aggregation_groups": [
      [
        "STOREID",
        "ITEMID",
        "CUSTOMERID"
      ],
      [
        "STATE",
        "CITY"
      ]
    ]
  },
  "signature": "OiAo60gPr38KVois4jHGKw==",
  "capacity": "MEDIUM",
  "last_modified": 1425496007007,
  "fact_table": "SALES_FACT",
  "null_string": null,
  "filter_condition": null,
  "cube_partition_desc": {
    "partition_date_column": null,
    "partition_date_start": 0,
    "cube_partition_type": "APPEND"
  },
  "hbase_mapping": {
    "column_family": [
      {
        "name": "F1",
        "columns": [
          {
            "qualifier": "M",
            "measure_refs": [
              "_COUNT_",
              "TOTALAMOUNT",
              "TOTALQTY"
            ]
          }
        ]
      }
    ]
  },
  "notify_list": []
}


Regards,
Santosh Akhilesh
Bangalore R&D
HUAWEI TECHNOLOGIES CO.,LTD.

www.huawei.com
-------------------------------------------------------------------------------------------------------------------------------------
This e-mail and its attachments contain confidential information from HUAWEI, which
is intended only for the person or entity whose address is listed above. Any use of the
information contained herein in any way (including, but not limited to, total or partial
disclosure, reproduction, or dissemination) by persons other than the intended
recipient(s) is prohibited. If you receive this e-mail in error, please notify the sender by
phone or email immediately and delete it!

________________________________________
From: Shi, Shaofeng [shaoshi@ebay.com]
Sent: Wednesday, March 04, 2015 6:40 PM
To: dev@kylin.incubator.apache.org
Cc: Kulbhushan Rana
Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial dimension

Hi Santosh, 0.6.5 should be stable, but there is an obvious error in this
cube JSON: the dimension 1 is from lookup table “STORE_DIM”, in this case
the “join” must be specified, while now it is null; This would cause Kylin
fail to join fact table with the lookup table. Please try to edit the cube
to add the join condition, let it looks like other dimensions; If the
wizard couldn’t work, try to manually add join and then use the “JSON
Editor” function to update the cube;

From now on we would suggest user to use 0.7.1; The binary package will be
much easier to install; and there are many enhancements and bug fixes in
0.7.1;

To get a fresh metadata store, just use a different htable, this is
configurable in kylin.properties:

kylin.metadata.url=kylin_metadata_qa@hbase

The default name is kylin_metadata_qa, if change a name, Kylin will get a
fresh metadata store;


On 3/4/15, 8:52 PM, "Santoshakhilesh" <sa...@huawei.com> wrote:

>Hi Shaofeng ,
>
>      I had deleted the cube and tried to build again for hierarchy, now
>it fails in first step itself.
>      I ahve three dimension tables
>      a) customer_dim b_ store_dim c) item_dim
>      I choose left join with fact table while cerating dimensions , but
>after cube creation left join on sotre_dim is automatically deleted by
>kylin
>      store dimension has fields stoireid , city , state , I had tried to
>add the hierarchy dimension (1) State and (2) City.
>
>      I have started facing many issues , even with normal dimensions the
>kylin query result not matching with hive query.
>      I had also tried to delete all the metadata in hive and hbase (i
>had deleted all the entries) and started from beginning by creating a new
>project but now problem is persisting.
>
>      Do you suggest me to install binary distribution is it stable
>enough now ? If yes how do I make sure all the previous data is deleted.
>deleting hive and hbase data is enough or I should delete something else
>too ?
>
>      Logs are as below. Sorry for long mail.
>
>
>
>
>
>JSON:
>{
>  "uuid": "5a5adb86-202a-4e2b-9be7-de421d3bdf2f",
>  "name": "Hierarchy",
>  "description": "",
>  "dimensions": [
>    {
>      "id": 1,
>      "name": "AREA",
>      "join": null,
>      "hierarchy": [
>        {
>          "level": "1",
>          "column": "STATE"
>        },
>        {
>          "level": "2",
>          "column": "CITY"
>        }
>      ],
>      "table": "STORE_DIM",
>      "column": null,
>      "datatype": null,
>      "derived": null
>    },
>    {
>      "id": 2,
>      "name": "CUSTOMER_DIM_DERIVED",
>      "join": {
>        "type": "left",
>        "primary_key": [
>          "CUSTOMERID"
>        ],
>        "foreign_key": [
>          "CUSTOMERID"
>        ]
>      },
>      "hierarchy": null,
>      "table": "CUSTOMER_DIM",
>      "column": "{FK}",
>      "datatype": null,
>      "derived": [
>        "NAME"
>      ]
>    },
>    {
>      "id": 3,
>      "name": "ITEM_DIM_DERIVED",
>      "join": {
>        "type": "left",
>        "primary_key": [
>          "ITEMID"
>        ],
>        "foreign_key": [
>          "ITEMID"
>        ]
>      },
>      "hierarchy": null,
>      "table": "ITEM_DIM",
>      "column": "{FK}",
>      "datatype": null,
>      "derived": [
>        "TYPE",
>        "BRAND",
>        "COLOR"
>      ]
>    }
>  ],
>  "measures": [
>    {
>      "id": 1,
>      "name": "_COUNT_",
>      "function": {
>        "expression": "COUNT",
>        "parameter": {
>          "type": "constant",
>          "value": "1"
>        },
>        "returntype": "bigint"
>      },
>      "dependent_measure_ref": null
>    },
>    {
>      "id": 2,
>      "name": "TOTALAMOUNT",
>      "function": {
>        "expression": "SUM",
>        "parameter": {
>          "type": "column",
>          "value": "AMOUNT"
>        },
>        "returntype": "double"
>      },
>      "dependent_measure_ref": null
>    },
>    {
>      "id": 3,
>      "name": "TOTALQTY",
>      "function": {
>        "expression": "SUM",
>        "parameter": {
>          "type": "column",
>          "value": "QTY"
>        },
>        "returntype": "int"
>      },
>      "dependent_measure_ref": null
>    }
>  ],
>  "rowkey": {
>    "rowkey_columns": [
>      {
>        "column": "STATE",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "CITY",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "CUSTOMERID",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "ITEMID",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      }
>    ],
>    "aggregation_groups": [
>      [
>        "CUSTOMERID",
>        "ITEMID"
>      ],
>      [
>        "STATE",
>        "CITY"
>      ]
>    ]
>  },
>  "signature": "X6NQ6wZ9ZgvhBLqw0YAKhQ==",
>  "capacity": "MEDIUM",
>  "last_modified": 1425492494239,
>  "fact_table": "SALES_FACT",
>  "null_string": null,
>  "filter_condition": null,
>  "cube_partition_desc": {
>    "partition_date_column": null,
>    "partition_date_start": 0,
>    "cube_partition_type": "APPEND"
>  },
>  "hbase_mapping": {
>    "column_family": [
>      {
>        "name": "F1",
>        "columns": [
>          {
>            "qualifier": "M",
>            "measure_refs": [
>              "_COUNT_",
>              "TOTALAMOUNT",
>              "TOTALQTY"
>            ]
>          }
>        ]
>      }
>    ]
>  },
>  "notify_list": []
>}
>
>Logs:
>
>15/03/05 02:09:18 WARN conf.HiveConf: DEPRECATED: Configuration property
>hive.metastore.local no longer has any effect. Make sure to provide a
>valid value for hive.metastore.uris if you are connecting to a remote
>metastore.
>15/03/05 02:09:18 WARN conf.HiveConf: HiveConf of name
>hive.metastore.local does not exist
>Logging initialized using configuration in
>jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-common-0.14.0.jar!/hive
>-log4j.properties
>SLF4J: Class path contains multiple SLF4J bindings.
>SLF4J: Found binding in
>[jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1
>.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>SLF4J: Found binding in
>[jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-jdbc-0.14.0-standalone
>.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>explanation.
>SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>OK
>Time taken: 0.578 seconds
>OK
>Time taken: 0.444 seconds
>FAILED: SemanticException [Error 10004]: Line 4:0 Invalid table alias or
>column reference 'STORE_DIM': (possible column names are:
>sales_fact.storeid, sales_fact.itemid, sales_fact.customerid,
>sales_fact.qty, sales_fact.amount, customer_dim.customerid,
>customer_dim.name, item_dim.itemid, item_dim.type, item_dim.brand,
>item_dim.color)
>
>
>
>Regards,
>Santosh Akhilesh
>Bangalore R&D
>HUAWEI TECHNOLOGIES CO.,LTD.
>
>www.huawei.com
>--------------------------------------------------------------------------
>-----------------------------------------------------------
>This e-mail and its attachments contain confidential information from
>HUAWEI, which
>is intended only for the person or entity whose address is listed above.
>Any use of the
>information contained herein in any way (including, but not limited to,
>total or partial
>disclosure, reproduction, or dissemination) by persons other than the
>intended
>recipient(s) is prohibited. If you receive this e-mail in error, please
>notify the sender by
>phone or email immediately and delete it!
>
>________________________________________
>From: Shi, Shaofeng [shaoshi@ebay.com]
>Sent: Wednesday, March 04, 2015 5:36 PM
>To: dev@kylin.incubator.apache.org
>Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
>dimension
>
>It seems that you have a lookup table which doesn’t define the join
>relationship; Could you paste the full json of this cube definition?
>
>On 3/4/15, 3:28 PM, "Santoshakhilesh" <sa...@huawei.com> wrote:
>
>>Dear All ,
>>
>>         I am using 0.6.5 branch of Kylin. I was able to build a cube
>>defining normal and derived measures and play with it.
>>
>>         I have defined a new cube to test hierarchial dimensions and
>>cube build is failed at Step 3 with following log in kylin.log
>>
>>         I have run the query which kylin provides on webui of cube on
>>hive and it works.
>>
>>         Please let me know whats going wrong ? Any more info required
>>from me please let me know.
>>
>>
>>
>>java.lang.NullPointerException
>> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>>
>>
>>
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,025][INFO][com.kylinolap.dict.lookup.SnapshotManager.load(Snapsh
>>o
>>tManager.java:156)] - Loading snapshotTable from
>>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot,
>>with loadData: false
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,031][INFO][com.kylinolap.dict.lookup.SnapshotManager.buildSnapsh
>>o
>>t(SnapshotManager.java:90)] - Identical input FileSignature
>>[path=file:/hive/warehouse/store_dim/stores.txt, size=60,
>>lastModifiedTime=1425039202000], reuse existing snapshot at
>>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,031][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRe
>>s
>>ource(ResourceStore.java:166)] - Saving resource /cube/NDim.json (Store
>>kylin_metadata_qa@hbase<ma...@hbase>)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,035][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTable(MetadataManager.java:258)] - Reloading SourceTable from folder
>>kylin_metadata_qa(key='/table')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTable(MetadataManager.java:267)] - Loaded 4 SourceTable(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTableExd(MetadataManager.java:243)] - Reloading SourceTable exd info
>>from folder kylin_metadata_qa(key='/table_exd')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,104][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTableExd(MetadataManager.java:253)] - Loaded 4 SourceTable EXD(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,104][INFO][com.kylinolap.metadata.MetadataManager.reloadAllCubeD
>>e
>>sc(MetadataManager.java:308)] - Reloading Cube Metadata from folder
>>kylin_metadata_qa(key='/cube_desc')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,143][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllCube
>>D
>>esc(MetadataManager.java:333)] - Loaded 4 Cube(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,143][INFO][com.kylinolap.metadata.MetadataManager.reloadAllInver
>>t
>>edIndexDesc(MetadataManager.java:356)] - Reloading Inverted Index Desc
>>from folder
>>kylin_metadata_qa(key='/invertedindex_desc')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,147][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllInve
>>r
>>tedIndexDesc(MetadataManager.java:381)] - Loaded 0 Inverted Index Desc(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,158][INFO][com.kylinolap.cube.cli.DictionaryGeneratorCLI.process
>>S
>>egment(DictionaryGeneratorCLI.java:59)] - Checking snapshot of STORE_DIM
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,159][ERROR][com.kylinolap.job.hadoop.dict.CreateDictionaryJob.ru
>>n
>>(CreateDictionaryJob.java:55)] -
>>java.lang.NullPointerException
>> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>> at
>>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGe
>>n
>>eratorCLI.java:60)
>> at
>>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGe
>>n
>>eratorCLI.java:39)
>> at
>>com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob
>>.
>>java:51)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>> at com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:54)
>> at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
>> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>> at
>>org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:
>>5
>>73)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,159][DEBUG][com.kylinolap.job.cmd.JavaHadoopCmdOutput.appendOutp
>>u
>>t(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,166][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRe
>>s
>>ource(ResourceStore.java:166)] - Saving resource
>>/job_output/70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 (Store
>>kylin_metadata_qa@hbase<ma...@hbase>)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,174][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRe
>>s
>>ource(ResourceStore.java:166)] - Saving resource
>>/job/70f7dfe5-f414-4643-a014-3ba5c5d3ab22 (Store
>>kylin_metadata_qa@hbase<ma...@hbase>)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNod
>>e
>>.java:87)] - Job status for
>>cube_job_group.NDim.70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 has been
>>updated.
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNod
>>e
>>.java:88)] - cmd: -cubename NDim -segmentname FULL_BUILD -input
>>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_column
>>s
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNod
>>e
>>.java:89)] - output:Start to execute command:
>> -cubename NDim -segmentname FULL_BUILD -input
>>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_column
>>s
>>Command execute return code 2
>>
>>
>>
>>Regards,
>>Santosh Akhilesh
>>Bangalore R&D
>>HUAWEI TECHNOLOGIES CO.,LTD.
>>
>>www.huawei.com
>>-------------------------------------------------------------------------
>>-
>>-----------------------------------------------------------
>>This e-mail and its attachments contain confidential information from
>>HUAWEI, which
>>is intended only for the person or entity whose address is listed above.
>>Any use of the
>>information contained herein in any way (including, but not limited to,
>>total or partial
>>disclosure, reproduction, or dissemination) by persons other than the
>>intended
>>recipient(s) is prohibited. If you receive this e-mail in error, please
>>notify the sender by
>>phone or email immediately and delete it!

Re: Cube Build failed at Step 3 , When I choose Hierarchial dimension

Posted by "Shi, Shaofeng" <sh...@ebay.com>.
Hi Santosh, 0.6.5 should be stable, but there is an obvious error in this
cube JSON: dimension 1 comes from the lookup table “STORE_DIM”, and in this
case the “join” must be specified, while it is currently null; this causes
Kylin to fail to join the fact table with the lookup table. Please edit the
cube to add the join condition so that it looks like the other dimensions;
if the wizard doesn’t work, try to add the join manually and then use the
“JSON Editor” function to update the cube.
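
For example, the “AREA” dimension could then look roughly like this (just an
illustration; it assumes STOREID is the join key on both sides, following the
pattern of your other dimensions):

    {
      "id": 1,
      "name": "AREA",
      "join": {
        "type": "left",
        "primary_key": [
          "STOREID"
        ],
        "foreign_key": [
          "STOREID"
        ]
      },
      "hierarchy": [
        {
          "level": "1",
          "column": "STATE"
        },
        {
          "level": "2",
          "column": "CITY"
        }
      ],
      "table": "STORE_DIM",
      "column": "{FK}",
      "datatype": null,
      "derived": null
    }

Without the join, the flat-table Hive statement generated in the first step
references STORE_DIM columns without ever joining STORE_DIM, which would
explain the “Invalid table alias or column reference 'STORE_DIM'” error in
your log.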

From now on we would suggest users use 0.7.1; the binary package is much
easier to install, and there are many enhancements and bug fixes in 0.7.1.

To get a fresh metadata store, just use a different HTable; this is
configurable in kylin.properties:

kylin.metadata.url=kylin_metadata_qa@hbase

The default name is kylin_metadata_qa; if you change the name, Kylin will
start with a fresh metadata store.
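
For example, pointing the property at a new table (the name below is only an
illustration) should give you an empty store to start from:

kylin.metadata.url=kylin_metadata_dev@hbase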


On 3/4/15, 8:52 PM, "Santoshakhilesh" <sa...@huawei.com> wrote:

>Hi Shaofeng ,
>      
>      I had deleted the cube and tried to build again for hierarchy, now
>it fails in first step itself.
>      I ahve three dimension tables
>      a) customer_dim b_ store_dim c) item_dim
>      I choose left join with fact table while cerating dimensions , but
>after cube creation left join on sotre_dim is automatically deleted by
>kylin
>      store dimension has fields stoireid , city , state , I had tried to
>add the hierarchy dimension (1) State and (2) City.
>
>      I have started facing many issues , even with normal dimensions the
>kylin query result not matching with hive query.
>      I had also tried to delete all the metadata in hive and hbase (i
>had deleted all the entries) and started from beginning by creating a new
>project but now problem is persisting.
>
>      Do you suggest me to install binary distribution is it stable
>enough now ? If yes how do I make sure all the previous data is deleted.
>deleting hive and hbase data is enough or I should delete something else
>too ?
>
>      Logs are as below. Sorry for long mail.
>      
>       
>
>
>
>JSON:
>{
>  "uuid": "5a5adb86-202a-4e2b-9be7-de421d3bdf2f",
>  "name": "Hierarchy",
>  "description": "",
>  "dimensions": [
>    {
>      "id": 1,
>      "name": "AREA",
>      "join": null,
>      "hierarchy": [
>        {
>          "level": "1",
>          "column": "STATE"
>        },
>        {
>          "level": "2",
>          "column": "CITY"
>        }
>      ],
>      "table": "STORE_DIM",
>      "column": null,
>      "datatype": null,
>      "derived": null
>    },
>    {
>      "id": 2,
>      "name": "CUSTOMER_DIM_DERIVED",
>      "join": {
>        "type": "left",
>        "primary_key": [
>          "CUSTOMERID"
>        ],
>        "foreign_key": [
>          "CUSTOMERID"
>        ]
>      },
>      "hierarchy": null,
>      "table": "CUSTOMER_DIM",
>      "column": "{FK}",
>      "datatype": null,
>      "derived": [
>        "NAME"
>      ]
>    },
>    {
>      "id": 3,
>      "name": "ITEM_DIM_DERIVED",
>      "join": {
>        "type": "left",
>        "primary_key": [
>          "ITEMID"
>        ],
>        "foreign_key": [
>          "ITEMID"
>        ]
>      },
>      "hierarchy": null,
>      "table": "ITEM_DIM",
>      "column": "{FK}",
>      "datatype": null,
>      "derived": [
>        "TYPE",
>        "BRAND",
>        "COLOR"
>      ]
>    }
>  ],
>  "measures": [
>    {
>      "id": 1,
>      "name": "_COUNT_",
>      "function": {
>        "expression": "COUNT",
>        "parameter": {
>          "type": "constant",
>          "value": "1"
>        },
>        "returntype": "bigint"
>      },
>      "dependent_measure_ref": null
>    },
>    {
>      "id": 2,
>      "name": "TOTALAMOUNT",
>      "function": {
>        "expression": "SUM",
>        "parameter": {
>          "type": "column",
>          "value": "AMOUNT"
>        },
>        "returntype": "double"
>      },
>      "dependent_measure_ref": null
>    },
>    {
>      "id": 3,
>      "name": "TOTALQTY",
>      "function": {
>        "expression": "SUM",
>        "parameter": {
>          "type": "column",
>          "value": "QTY"
>        },
>        "returntype": "int"
>      },
>      "dependent_measure_ref": null
>    }
>  ],
>  "rowkey": {
>    "rowkey_columns": [
>      {
>        "column": "STATE",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "CITY",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "CUSTOMERID",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      },
>      {
>        "column": "ITEMID",
>        "length": 0,
>        "dictionary": "true",
>        "mandatory": false
>      }
>    ],
>    "aggregation_groups": [
>      [
>        "CUSTOMERID",
>        "ITEMID"
>      ],
>      [
>        "STATE",
>        "CITY"
>      ]
>    ]
>  },
>  "signature": "X6NQ6wZ9ZgvhBLqw0YAKhQ==",
>  "capacity": "MEDIUM",
>  "last_modified": 1425492494239,
>  "fact_table": "SALES_FACT",
>  "null_string": null,
>  "filter_condition": null,
>  "cube_partition_desc": {
>    "partition_date_column": null,
>    "partition_date_start": 0,
>    "cube_partition_type": "APPEND"
>  },
>  "hbase_mapping": {
>    "column_family": [
>      {
>        "name": "F1",
>        "columns": [
>          {
>            "qualifier": "M",
>            "measure_refs": [
>              "_COUNT_",
>              "TOTALAMOUNT",
>              "TOTALQTY"
>            ]
>          }
>        ]
>      }
>    ]
>  },
>  "notify_list": []
>}
>
>Logs:
>
>15/03/05 02:09:18 WARN conf.HiveConf: DEPRECATED: Configuration property
>hive.metastore.local no longer has any effect. Make sure to provide a
>valid value for hive.metastore.uris if you are connecting to a remote
>metastore.
>15/03/05 02:09:18 WARN conf.HiveConf: HiveConf of name
>hive.metastore.local does not exist
>Logging initialized using configuration in
>jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-common-0.14.0.jar!/hive
>-log4j.properties
>SLF4J: Class path contains multiple SLF4J bindings.
>SLF4J: Found binding in
>[jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1
>.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>SLF4J: Found binding in
>[jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-jdbc-0.14.0-standalone
>.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>explanation.
>SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>OK
>Time taken: 0.578 seconds
>OK
>Time taken: 0.444 seconds
>FAILED: SemanticException [Error 10004]: Line 4:0 Invalid table alias or
>column reference 'STORE_DIM': (possible column names are:
>sales_fact.storeid, sales_fact.itemid, sales_fact.customerid,
>sales_fact.qty, sales_fact.amount, customer_dim.customerid,
>customer_dim.name, item_dim.itemid, item_dim.type, item_dim.brand,
>item_dim.color)
>
>
>
>Regards,
>Santosh Akhilesh
>Bangalore R&D
>HUAWEI TECHNOLOGIES CO.,LTD.
>
>www.huawei.com
>--------------------------------------------------------------------------
>-----------------------------------------------------------
>This e-mail and its attachments contain confidential information from
>HUAWEI, which
>is intended only for the person or entity whose address is listed above.
>Any use of the
>information contained herein in any way (including, but not limited to,
>total or partial
>disclosure, reproduction, or dissemination) by persons other than the
>intended
>recipient(s) is prohibited. If you receive this e-mail in error, please
>notify the sender by
>phone or email immediately and delete it!
>
>________________________________________
>From: Shi, Shaofeng [shaoshi@ebay.com]
>Sent: Wednesday, March 04, 2015 5:36 PM
>To: dev@kylin.incubator.apache.org
>Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial
>dimension
>
>It seems that you have a lookup table which doesn’t define the join
>relationship; Could you paste the full json of this cube definition?
>
>On 3/4/15, 3:28 PM, "Santoshakhilesh" <sa...@huawei.com> wrote:
>
>>Dear All ,
>>
>>         I am using 0.6.5 branch of Kylin. I was able to build a cube
>>defining normal and derived measures and play with it.
>>
>>         I have defined a new cube to test hierarchial dimensions and
>>cube build is failed at Step 3 with following log in kylin.log
>>
>>         I have run the query which kylin provides on webui of cube on
>>hive and it works.
>>
>>         Please let me know whats going wrong ? Any more info required
>>from me please let me know.
>>
>>
>>
>>java.lang.NullPointerException
>> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>>
>>
>>
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,025][INFO][com.kylinolap.dict.lookup.SnapshotManager.load(Snapsh
>>o
>>tManager.java:156)] - Loading snapshotTable from
>>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot,
>>with loadData: false
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,031][INFO][com.kylinolap.dict.lookup.SnapshotManager.buildSnapsh
>>o
>>t(SnapshotManager.java:90)] - Identical input FileSignature
>>[path=file:/hive/warehouse/store_dim/stores.txt, size=60,
>>lastModifiedTime=1425039202000], reuse existing snapshot at
>>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,031][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRe
>>s
>>ource(ResourceStore.java:166)] - Saving resource /cube/NDim.json (Store
>>kylin_metadata_qa@hbase<ma...@hbase>)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,035][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTable(MetadataManager.java:258)] - Reloading SourceTable from folder
>>kylin_metadata_qa(key='/table')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTable(MetadataManager.java:267)] - Loaded 4 SourceTable(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTableExd(MetadataManager.java:243)] - Reloading SourceTable exd info
>>from folder kylin_metadata_qa(key='/table_exd')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,104][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSour
>>c
>>eTableExd(MetadataManager.java:253)] - Loaded 4 SourceTable EXD(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,104][INFO][com.kylinolap.metadata.MetadataManager.reloadAllCubeD
>>e
>>sc(MetadataManager.java:308)] - Reloading Cube Metadata from folder
>>kylin_metadata_qa(key='/cube_desc')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,143][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllCube
>>D
>>esc(MetadataManager.java:333)] - Loaded 4 Cube(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,143][INFO][com.kylinolap.metadata.MetadataManager.reloadAllInver
>>t
>>edIndexDesc(MetadataManager.java:356)] - Reloading Inverted Index Desc
>>from folder
>>kylin_metadata_qa(key='/invertedindex_desc')@kylin_metadata_qa@hbase
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,147][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllInve
>>r
>>tedIndexDesc(MetadataManager.java:381)] - Loaded 0 Inverted Index Desc(s)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,158][INFO][com.kylinolap.cube.cli.DictionaryGeneratorCLI.process
>>S
>>egment(DictionaryGeneratorCLI.java:59)] - Checking snapshot of STORE_DIM
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,159][ERROR][com.kylinolap.job.hadoop.dict.CreateDictionaryJob.ru
>>n
>>(CreateDictionaryJob.java:55)] -
>>java.lang.NullPointerException
>> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>> at
>>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGe
>>n
>>eratorCLI.java:60)
>> at
>>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGe
>>n
>>eratorCLI.java:39)
>> at
>>com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob
>>.
>>java:51)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>> at com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:54)
>> at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
>> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>> at
>>org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:
>>5
>>73)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,159][DEBUG][com.kylinolap.job.cmd.JavaHadoopCmdOutput.appendOutp
>>u
>>t(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,166][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRe
>>s
>>ource(ResourceStore.java:166)] - Saving resource
>>/job_output/70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 (Store
>>kylin_metadata_qa@hbase<ma...@hbase>)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,174][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRe
>>s
>>ource(ResourceStore.java:166)] - Saving resource
>>/job/70f7dfe5-f414-4643-a014-3ba5c5d3ab22 (Store
>>kylin_metadata_qa@hbase<ma...@hbase>)
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNod
>>e
>>.java:87)] - Job status for
>>cube_job_group.NDim.70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 has been
>>updated.
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNod
>>e
>>.java:88)] - cmd: -cubename NDim -segmentname FULL_BUILD -input
>>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_column
>>s
>>[QuartzScheduler_Worker-10]:[2015-03-04
>>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNod
>>e
>>.java:89)] - output:Start to execute command:
>> -cubename NDim -segmentname FULL_BUILD -input
>>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_column
>>s
>>Command execute return code 2
>>
>>
>>
>>Regards,
>>Santosh Akhilesh
>>Bangalore R&D
>>HUAWEI TECHNOLOGIES CO.,LTD.
>>
>>www.huawei.com
>>-------------------------------------------------------------------------
>>-
>>-----------------------------------------------------------
>>This e-mail and its attachments contain confidential information from
>>HUAWEI, which
>>is intended only for the person or entity whose address is listed above.
>>Any use of the
>>information contained herein in any way (including, but not limited to,
>>total or partial
>>disclosure, reproduction, or dissemination) by persons other than the
>>intended
>>recipient(s) is prohibited. If you receive this e-mail in error, please
>>notify the sender by
>>phone or email immediately and delete it!


RE: Cube Build failed at Step 3 , When I choose Hierarchial dimension

Posted by Santoshakhilesh <sa...@huawei.com>.
Hi Shaofeng,

      I had deleted the cube and tried to build it again for the hierarchy; now it fails at the first step itself.
      I have three dimension tables:
      a) customer_dim b) store_dim c) item_dim
      I chose a left join with the fact table while creating the dimensions, but after cube creation the left join on store_dim is automatically deleted by Kylin.
      The store dimension has the fields storeid, city, state; I had tried to add the hierarchy dimension as (1) State and (2) City.

      I have started facing many issues; even with normal dimensions the Kylin query results do not match the Hive query.
      I had also tried to delete all the metadata in Hive and HBase (I had deleted all the entries) and started from the beginning by creating a new project, but the problem persists.

      Do you suggest I install the binary distribution; is it stable enough now? If yes, how do I make sure all the previous data is deleted? Is deleting the Hive and HBase data enough, or should I delete something else too?

      Logs are as below. Sorry for the long mail.
      
       



JSON:
{
  "uuid": "5a5adb86-202a-4e2b-9be7-de421d3bdf2f",
  "name": "Hierarchy",
  "description": "",
  "dimensions": [
    {
      "id": 1,
      "name": "AREA",
      "join": null,
      "hierarchy": [
        {
          "level": "1",
          "column": "STATE"
        },
        {
          "level": "2",
          "column": "CITY"
        }
      ],
      "table": "STORE_DIM",
      "column": null,
      "datatype": null,
      "derived": null
    },
    {
      "id": 2,
      "name": "CUSTOMER_DIM_DERIVED",
      "join": {
        "type": "left",
        "primary_key": [
          "CUSTOMERID"
        ],
        "foreign_key": [
          "CUSTOMERID"
        ]
      },
      "hierarchy": null,
      "table": "CUSTOMER_DIM",
      "column": "{FK}",
      "datatype": null,
      "derived": [
        "NAME"
      ]
    },
    {
      "id": 3,
      "name": "ITEM_DIM_DERIVED",
      "join": {
        "type": "left",
        "primary_key": [
          "ITEMID"
        ],
        "foreign_key": [
          "ITEMID"
        ]
      },
      "hierarchy": null,
      "table": "ITEM_DIM",
      "column": "{FK}",
      "datatype": null,
      "derived": [
        "TYPE",
        "BRAND",
        "COLOR"
      ]
    }
  ],
  "measures": [
    {
      "id": 1,
      "name": "_COUNT_",
      "function": {
        "expression": "COUNT",
        "parameter": {
          "type": "constant",
          "value": "1"
        },
        "returntype": "bigint"
      },
      "dependent_measure_ref": null
    },
    {
      "id": 2,
      "name": "TOTALAMOUNT",
      "function": {
        "expression": "SUM",
        "parameter": {
          "type": "column",
          "value": "AMOUNT"
        },
        "returntype": "double"
      },
      "dependent_measure_ref": null
    },
    {
      "id": 3,
      "name": "TOTALQTY",
      "function": {
        "expression": "SUM",
        "parameter": {
          "type": "column",
          "value": "QTY"
        },
        "returntype": "int"
      },
      "dependent_measure_ref": null
    }
  ],
  "rowkey": {
    "rowkey_columns": [
      {
        "column": "STATE",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      },
      {
        "column": "CITY",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      },
      {
        "column": "CUSTOMERID",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      },
      {
        "column": "ITEMID",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      }
    ],
    "aggregation_groups": [
      [
        "CUSTOMERID",
        "ITEMID"
      ],
      [
        "STATE",
        "CITY"
      ]
    ]
  },
  "signature": "X6NQ6wZ9ZgvhBLqw0YAKhQ==",
  "capacity": "MEDIUM",
  "last_modified": 1425492494239,
  "fact_table": "SALES_FACT",
  "null_string": null,
  "filter_condition": null,
  "cube_partition_desc": {
    "partition_date_column": null,
    "partition_date_start": 0,
    "cube_partition_type": "APPEND"
  },
  "hbase_mapping": {
    "column_family": [
      {
        "name": "F1",
        "columns": [
          {
            "qualifier": "M",
            "measure_refs": [
              "_COUNT_",
              "TOTALAMOUNT",
              "TOTALQTY"
            ]
          }
        ]
      }
    ]
  },
  "notify_list": []
}

Logs:

15/03/05 02:09:18 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
15/03/05 02:09:18 WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist
Logging initialized using configuration in jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-common-0.14.0.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hive/apache-hive-0.14.0-bin/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
OK
Time taken: 0.578 seconds
OK
Time taken: 0.444 seconds
FAILED: SemanticException [Error 10004]: Line 4:0 Invalid table alias or column reference 'STORE_DIM': (possible column names are: sales_fact.storeid, sales_fact.itemid, sales_fact.customerid, sales_fact.qty, sales_fact.amount, customer_dim.customerid, customer_dim.name, item_dim.itemid, item_dim.type, item_dim.brand, item_dim.color)



Regards,
Santosh Akhilesh
Bangalore R&D
HUAWEI TECHNOLOGIES CO.,LTD.

www.huawei.com
-------------------------------------------------------------------------------------------------------------------------------------
This e-mail and its attachments contain confidential information from HUAWEI, which
is intended only for the person or entity whose address is listed above. Any use of the
information contained herein in any way (including, but not limited to, total or partial
disclosure, reproduction, or dissemination) by persons other than the intended
recipient(s) is prohibited. If you receive this e-mail in error, please notify the sender by
phone or email immediately and delete it!

________________________________________
From: Shi, Shaofeng [shaoshi@ebay.com]
Sent: Wednesday, March 04, 2015 5:36 PM
To: dev@kylin.incubator.apache.org
Subject: Re: Cube Build failed at Step 3 , When I choose Hierarchial dimension

It seems that you have a lookup table which doesn’t define the join
relationship; Could you paste the full json of this cube definition?


Re: Cube Build failed at Step 3 , When I choose Hierarchial dimension

Posted by "Shi, Shaofeng" <sh...@ebay.com>.
It seems that you have a lookup table which doesn't define the join
relationship; Could you paste the full json of this cube definition?

On 3/4/15, 3:28 PM, "Santoshakhilesh" <sa...@huawei.com> wrote:

>Dear All ,
>
>         I am using 0.6.5 branch of Kylin. I was able to build a cube
>defining normal and derived measures and play with it.
>
>         I have defined a new cube to test hierarchial dimensions and
>cube build is failed at Step 3 with following log in kylin.log
>
>         I have run the query which kylin provides on webui of cube on
>hive and it works.
>
>         Please let me know whats going wrong ? Any more info required
>from me please let me know.
>
>
>
>java.lang.NullPointerException
> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
>
>
>
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,025][INFO][com.kylinolap.dict.lookup.SnapshotManager.load(Snapsho
>tManager.java:156)] - Loading snapshotTable from
>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot,
>with loadData: false
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,031][INFO][com.kylinolap.dict.lookup.SnapshotManager.buildSnapsho
>t(SnapshotManager.java:90)] - Identical input FileSignature
>[path=file:/hive/warehouse/store_dim/stores.txt, size=60,
>lastModifiedTime=1425039202000], reuse existing snapshot at
>/table_snapshot/stores.txt/780e5ebc-95d5-4d87-bd8d-7e561f19283a.snapshot
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,031][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRes
>ource(ResourceStore.java:166)] - Saving resource /cube/NDim.json (Store
>kylin_metadata_qa@hbase<ma...@hbase>)
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,035][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourc
>eTable(MetadataManager.java:258)] - Reloading SourceTable from folder
>kylin_metadata_qa(key='/table')@kylin_metadata_qa@hbase
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourc
>eTable(MetadataManager.java:267)] - Loaded 4 SourceTable(s)
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,074][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourc
>eTableExd(MetadataManager.java:243)] - Reloading SourceTable exd info
>from folder kylin_metadata_qa(key='/table_exd')@kylin_metadata_qa@hbase
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,104][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllSourc
>eTableExd(MetadataManager.java:253)] - Loaded 4 SourceTable EXD(s)
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,104][INFO][com.kylinolap.metadata.MetadataManager.reloadAllCubeDe
>sc(MetadataManager.java:308)] - Reloading Cube Metadata from folder
>kylin_metadata_qa(key='/cube_desc')@kylin_metadata_qa@hbase
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,143][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllCubeD
>esc(MetadataManager.java:333)] - Loaded 4 Cube(s)
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,143][INFO][com.kylinolap.metadata.MetadataManager.reloadAllInvert
>edIndexDesc(MetadataManager.java:356)] - Reloading Inverted Index Desc
>from folder 
>kylin_metadata_qa(key='/invertedindex_desc')@kylin_metadata_qa@hbase
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,147][DEBUG][com.kylinolap.metadata.MetadataManager.reloadAllInver
>tedIndexDesc(MetadataManager.java:381)] - Loaded 0 Inverted Index Desc(s)
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,158][INFO][com.kylinolap.cube.cli.DictionaryGeneratorCLI.processS
>egment(DictionaryGeneratorCLI.java:59)] - Checking snapshot of STORE_DIM
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,159][ERROR][com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run
>(CreateDictionaryJob.java:55)] -
>java.lang.NullPointerException
> at com.kylinolap.cube.CubeManager.getLookupTable(CubeManager.java:424)
> at 
>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGen
>eratorCLI.java:60)
> at 
>com.kylinolap.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGen
>eratorCLI.java:39)
> at 
>com.kylinolap.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.
>java:51)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> at com.kylinolap.job.cmd.JavaHadoopCmd.execute(JavaHadoopCmd.java:54)
> at com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode.java:77)
> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
> at 
>org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:5
>73)
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,159][DEBUG][com.kylinolap.job.cmd.JavaHadoopCmdOutput.appendOutpu
>t(JavaHadoopCmdOutput.java:96)] - Command execute return code 2
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,166][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRes
>ource(ResourceStore.java:166)] - Saving resource
>/job_output/70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 (Store
>kylin_metadata_qa@hbase<ma...@hbase>)
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,174][DEBUG][com.kylinolap.common.persistence.ResourceStore.putRes
>ource(ResourceStore.java:166)] - Saving resource
>/job/70f7dfe5-f414-4643-a014-3ba5c5d3ab22 (Store
>kylin_metadata_qa@hbase<ma...@hbase>)
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode
>.java:87)] - Job status for
>cube_job_group.NDim.70f7dfe5-f414-4643-a014-3ba5c5d3ab22.2 has been
>updated.
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode
>.java:88)] - cmd: -cubename NDim -segmentname FULL_BUILD -input
>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_columns
>[QuartzScheduler_Worker-10]:[2015-03-04
>19:00:49,177][INFO][com.kylinolap.job.flow.JobFlowNode.execute(JobFlowNode
>.java:89)] - output:Start to execute command:
> -cubename NDim -segmentname FULL_BUILD -input
>/tmp/kylin-70f7dfe5-f414-4643-a014-3ba5c5d3ab22/NDim/fact_distinct_columns
>Command execute return code 2
>
>
>
>Regards,
>Santosh Akhilesh
>Bangalore R&D
>HUAWEI TECHNOLOGIES CO.,LTD.
>
>www.huawei.com
>--------------------------------------------------------------------------
>-----------------------------------------------------------
>This e-mail and its attachments contain confidential information from
>HUAWEI, which
>is intended only for the person or entity whose address is listed above.
>Any use of the
>information contained herein in any way (including, but not limited to,
>total or partial
>disclosure, reproduction, or dissemination) by persons other than the
>intended
>recipient(s) is prohibited. If you receive this e-mail in error, please
>notify the sender by
>phone or email immediately and delete it!