Posted to issues@kylin.apache.org by "XiaoXiang Yu (JIRA)" <ji...@apache.org> on 2019/06/19 10:28:00 UTC

[jira] [Commented] (KYLIN-4051) build cube met the InvalidProtocolBufferException

    [ https://issues.apache.org/jira/browse/KYLIN-4051?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16867476#comment-16867476 ] 

XiaoXiang Yu commented on KYLIN-4051:
-------------------------------------

Hi [~xiangakun], if you don't mind, I suggest you upgrade to the latest stable version, such as 2.5.x, and perhaps move to a newer CDH cluster as well, somewhere in the 5.7 to 5.15 range.
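If you want to confirm that the problem is the cluster's Hive/ORC libraries rather than Kylin itself, a small stand-alone check like the sketch below may help. It is only a sketch under assumptions: it assumes hive-exec (1.2.x) and the Hadoop client jars are on the classpath, and the file path argument is a hypothetical placeholder for one ORC file of the source/flat table. It tries to read a few rows with the same org.apache.hadoop.hive.ql.io.orc reader classes that appear in your stack trace; if it throws the same InvalidProtocolBufferException, the ORC files were written by a newer writer than the reader on the cluster can parse, which points to the version mismatch above.

    // OrcReadCheck.java -- minimal diagnostic sketch, not part of Kylin.
    // Assumptions: hive-exec (1.2.x) and hadoop-common on the classpath;
    // args[0] is a hypothetical path to one ORC file of the table on HDFS.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hive.ql.io.orc.OrcFile;
    import org.apache.hadoop.hive.ql.io.orc.Reader;
    import org.apache.hadoop.hive.ql.io.orc.RecordReader;

    public class OrcReadCheck {
        public static void main(String[] args) throws Exception {
            Path path = new Path(args[0]);          // one ORC file under the table's location
            Configuration conf = new Configuration();
            Reader reader = OrcFile.createReader(path, OrcFile.readerOptions(conf));
            System.out.println("rows in file: " + reader.getNumberOfRows());
            RecordReader rows = reader.rows();      // triggers stripe-footer parsing, where the job fails
            Object row = null;
            long n = 0;
            while (rows.hasNext() && n < 10) {      // reading a handful of rows is enough
                row = rows.next(row);
                n++;
            }
            rows.close();
            System.out.println("successfully read " + n + " rows");
        }
    }

If this small program reads the file without error, the ORC files themselves are fine and the problem is more likely inside the Kylin job's classpath.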

> build cube met the InvalidProtocolBufferException
> -------------------------------------------------
>
>                 Key: KYLIN-4051
>                 URL: https://issues.apache.org/jira/browse/KYLIN-4051
>             Project: Kylin
>          Issue Type: Bug
>          Components: Job Engine
>    Affects Versions: v2.1.0
>         Environment: Kylin 2.1, hbase 1.2.0-cdh5.10.1, hadoop-2.5.0-cdh5.3.2, hive 1.2.1
>            Reporter: xiangakun
>            Priority: Major
>              Labels: easyfix
>
> Dear all,
> When I tried to build a cube, I got the following errors. Has anyone met the same error before? I hope to get your feedback soon; thanks in advance.
> Error: java.io.IOException: java.lang.reflect.InvocationTargetException
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:295)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:242)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:356)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:591)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:438)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at java.security.AccessController.doPrivileged(Native Method)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at javax.security.auth.Subject.doAs(Subject.java:422)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
> 2019-06-19 15:57:05,569 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : Caused by: java.lang.reflect.InvocationTargetException
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:281)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : ... 11 more
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : Caused by: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: streams[3].kind, streams[8].kind, streams[11].kind, streams[14].kind, streams[21].kind, streams[24].kind, streams[28].kind
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at com.google.protobuf.AbstractParser.checkMessageInitialized(AbstractParser.java:71)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.orc.OrcProto$StripeFooter.parseFrom(OrcProto.java:8878)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.readStripeFooter(RecordReaderImpl.java:2174)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.readStripe(RecordReaderImpl.java:2505)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.advanceStripe(RecordReaderImpl.java:2949)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.advanceToNextRow(RecordReaderImpl.java:2991)
> 2019-06-19 15:57:05,570 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.<init>(RecordReaderImpl.java:284)
> 2019-06-19 15:57:05,571 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.rowsOptions(ReaderImpl.java:480)
> 2019-06-19 15:57:05,571 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.createReaderFromFile(OrcInputFormat.java:214)
> 2019-06-19 15:57:05,571 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.<init>(OrcInputFormat.java:146)
> 2019-06-19 15:57:05,571 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getRecordReader(OrcInputFormat.java:997)
> 2019-06-19 15:57:05,571 INFO [Job 5e7639c4-660a-46dd-852e-62bf3fc4a713-1559] hive.CreateFlatHiveTableStep:38 : at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:65)
>
> The following is the cube JSON:
> {
>   "uuid": "1300457a-2ef9-415e-bf90-9fa310b71b26",
>   "last_modified": 1560930896836,
>   "version": "2.1.0",
>   "name": "sales_value_details_info_day",
>   "is_draft": false,
>   "model_name": "sales_value_details_info_day",
>   "description": "sales_value_details_info_day",
>   "null_string": null,
>   "dimensions": [
>     {
>       "name": "PO_NO",
>       "table": "SALES_VALUE_DETAILS_INFO_DAY",
>       "column": "PO_NO",
>       "derived": null
>     },
>     {
>       "name": "ITEM_CODE",
>       "table": "SALES_VALUE_DETAILS_INFO_DAY",
>       "column": "ITEM_CODE",
>       "derived": null
>     },
>     {
>       "name": "ITEM_DESC",
>       "table": "SALES_VALUE_DETAILS_INFO_DAY",
>       "column": "ITEM_DESC",
>       "derived": null
>     },
>     {
>       "name": "BRAND_CODE",
>       "table": "SALES_VALUE_DETAILS_INFO_DAY",
>       "column": "BRAND_CODE",
>       "derived": null
>     },
>     {
>       "name": "BRAND_NAME",
>       "table": "SALES_VALUE_DETAILS_INFO_DAY",
>       "column": "BRAND_NAME",
>       "derived": null
>     },
>     {
>       "name": "DT",
>       "table": "SALES_VALUE_DETAILS_INFO_DAY",
>       "column": "DT",
>       "derived": null
>     }
>   ],
>   "measures": [
>     {
>       "name": "_COUNT_",
>       "function": {
>         "expression": "COUNT",
>         "parameter": {
>           "type": "constant",
>           "value": "1"
>         },
>         "returntype": "bigint"
>       }
>     },
>     {
>       "name": "NET_SALES_COST",
>       "function": {
>         "expression": "SUM",
>         "parameter": {
>           "type": "column",
>           "value": "SALES_VALUE_DETAILS_INFO_DAY.NET_SALES_COST"
>         },
>         "returntype": "decimal(19,4)"
>       }
>     },
>     {
>       "name": "NET_SALES_MONEY",
>       "function": {
>         "expression": "SUM",
>         "parameter": {
>           "type": "column",
>           "value": "SALES_VALUE_DETAILS_INFO_DAY.NET_SALES_MONEY"
>         },
>         "returntype": "decimal(19,4)"
>       }
>     },
>     {
>       "name": "SALES_COST",
>       "function": {
>         "expression": "SUM",
>         "parameter": {
>           "type": "column",
>           "value": "SALES_VALUE_DETAILS_INFO_DAY.SALES_COST"
>         },
>         "returntype": "decimal(19,4)"
>       }
>     }
>   ],
>   "dictionaries": [],
>   "rowkey": {
>     "rowkey_columns": [
>       {
>         "column": "SALES_VALUE_DETAILS_INFO_DAY.PO_NO",
>         "encoding": "dict",
>         "isShardBy": false
>       },
>       {
>         "column": "SALES_VALUE_DETAILS_INFO_DAY.ITEM_CODE",
>         "encoding": "dict",
>         "isShardBy": false
>       },
>       {
>         "column": "SALES_VALUE_DETAILS_INFO_DAY.ITEM_DESC",
>         "encoding": "dict",
>         "isShardBy": false
>       },
>       {
>         "column": "SALES_VALUE_DETAILS_INFO_DAY.BRAND_CODE",
>         "encoding": "dict",
>         "isShardBy": false
>       },
>       {
>         "column": "SALES_VALUE_DETAILS_INFO_DAY.BRAND_NAME",
>         "encoding": "dict",
>         "isShardBy": false
>       },
>       {
>         "column": "SALES_VALUE_DETAILS_INFO_DAY.DT",
>         "encoding": "dict",
>         "isShardBy": false
>       }
>     ]
>   },
>   "hbase_mapping": {
>     "column_family": [
>       {
>         "name": "F1",
>         "columns": [
>           {
>             "qualifier": "M",
>             "measure_refs": [
>               "_COUNT_",
>               "NET_SALES_COST",
>               "NET_SALES_MONEY",
>               "SALES_COST"
>             ]
>           }
>         ]
>       }
>     ]
>   },
>   "aggregation_groups": [
>     {
>       "includes": [
>         "SALES_VALUE_DETAILS_INFO_DAY.PO_NO",
>         "SALES_VALUE_DETAILS_INFO_DAY.ITEM_CODE",
>         "SALES_VALUE_DETAILS_INFO_DAY.ITEM_DESC",
>         "SALES_VALUE_DETAILS_INFO_DAY.BRAND_CODE",
>         "SALES_VALUE_DETAILS_INFO_DAY.BRAND_NAME",
>         "SALES_VALUE_DETAILS_INFO_DAY.DT"
>       ],
>       "select_rule": {
>         "hierarchy_dims": [],
>         "mandatory_dims": [
>           "SALES_VALUE_DETAILS_INFO_DAY.DT"
>         ],
>         "joint_dims": []
>       }
>     }
>   ],
>   "signature": "xKEMvizWVPEl6EMWHB+SGg==",
>   "notify_list": [],
>   "status_need_notify": [
>     "ERROR",
>     "DISCARDED",
>     "SUCCEED"
>   ],
>   "partition_date_start": 1560643200000,
>   "partition_date_end": 3153600000000,
>   "auto_merge_time_ranges": [],
>   "retention_range": 0,
>   "cuboid_cut_size_mb": null,
>   "exclude_cuboid_layer": null,
>   "engine_type": 2,
>   "storage_type": 2,
>   "override_kylin_properties": {
>     "kylin.engine.mr.config-override.fs.defaultFS": "hdfs://bipcluster",
>     "kylin.engine.mr.config-override.mapreduce.jobhistory.address": "sd-hadoop-journalnode-71-21.idc.vip.com:10020",
>     "kylin.engine.mr.config-override.mapreduce.jobhistory.webapp.address": "sd-hadoop-journalnode-71-21.idc.vip.com:19888"
>   },
>   "cuboid_black_list": [],
>   "parent_forward": 3
> }


