Posted to issues@iceberg.apache.org by GitBox <gi...@apache.org> on 2021/04/23 10:01:09 UTC

[GitHub] [iceberg] DongTL opened a new issue #2507: Stack map does not match the one at exception handler

DongTL opened a new issue #2507:
URL: https://github.com/apache/iceberg/issues/2507


   Flink version: v1.11.3
   Hive version: v2.1.1_cdh6.3.1
   
   Run the SQL client:
   ````
   sql-client.sh embedded -j /.../iceberg-flink-runtime-0.10.0.jar  -j /.../flink-sql-connector-hive-2.2.0_2.11-1.11.3.jar shell
   ````
   
   Execute:
   ````
   CREATE CATALOG hive_catalog WITH (
     'type'='iceberg',
     'catalog-type'='hive',
     'uri'='thrift://vplc01:9083',
     'clients'='5',
     'property-version'='1',
     'hive-conf-dir'='/etc/alternatives/hive-conf/'
   );
   ````
   
   Exception:
   ````
   Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
           at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
   Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
           at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:870)
           at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
           at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
           at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
   Caused by: java.lang.VerifyError: Stack map does not match the one at exception handler 18
   Exception Details:
     Location:
       org/apache/iceberg/hive/HiveCatalog.alterHiveDataBase(Lorg/apache/iceberg/catalog/Namespace;Lorg/apache/hadoop/hive/metastore/api/Database;)V @18: astore_3
     Reason:
       Type 'org/apache/hadoop/hive/metastore/api/NoSuchObjectException' (current frame, stack[0]) is not assignable to 'org/apache/thrift/TException' (stack map, stack[0])
     Current Frame:
       bci: @0
       flags: { }
       locals: { 'org/apache/iceberg/hive/HiveCatalog', 'org/apache/iceberg/catalog/Namespace', 'org/apache/hadoop/hive/metastore/api/Database' }
       stack: { 'org/apache/hadoop/hive/metastore/api/NoSuchObjectException' }
     Stackmap Frame:
       bci: @18
       flags: { }
       locals: { 'org/apache/iceberg/hive/HiveCatalog', 'org/apache/iceberg/catalog/Namespace', 'org/apache/hadoop/hive/metastore/api/Database' }
       stack: { 'org/apache/thrift/TException' }
     Bytecode:
       0x0000000: 2ab4 0038 2b2c ba02 2800 00b6 009a 57a7
       0x0000010: 0065 4ebb 00c7 592d 12c9 04bd 00cb 5903
       0x0000020: 2b53 b702 29bf 4ebb 00d0 59bb 00d2 59b7
       0x0000030: 00d3 1302 2bb6 00d9 2bb6 00dc 1301 a4b6
       0x0000040: 00d9 b600 e02d b700 e3bf 4eb8 0040 b600
       0x0000050: e6bb 00d0 59bb 00d2 59b7 00d3 1302 2db6
       0x0000060: 00d9 2bb6 00dc 1301 a4b6 00d9 b600 e02d
       0x0000070: b700 e3bf b1
     Exception Handler Table:
       bci [0, 15] => handler: 18
       bci [0, 15] => handler: 18
       bci [0, 15] => handler: 38
       bci [0, 15] => handler: 74
     Stackmap Table:
       same_locals_1_stack_item_frame(@18,Object[#111])
       same_locals_1_stack_item_frame(@38,Object[#111])
       same_locals_1_stack_item_frame(@74,Object[#113])
       same_frame(@116)
   
           at org.apache.iceberg.flink.CatalogLoader$HiveCatalogLoader.loadCatalog(CatalogLoader.java:95)
           at org.apache.iceberg.flink.FlinkCatalog.<init>(FlinkCatalog.java:104)
           at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:132)
           at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:122)
           at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:378)
           at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:626)
           at java.util.HashMap.forEach(HashMap.java:1289)
           at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
           at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
           at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
           at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
           at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
           at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
           at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
           ... 3 more
   ````
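
   For context (this explanation is not part of the original report): a VerifyError of this
   kind usually means the JVM loaded Hive metastore / Thrift classes from a different jar
   than the one HiveCatalog was compiled against, so the verifier can no longer prove that
   NoSuchObjectException is a subtype of TException, as the stack map of alterHiveDataBase
   requires. The sketch below only reconstructs the kind of catch structure involved; it is
   not copied from Iceberg's source, and the class and method names other than the
   Hive/Thrift types named in the trace are made up for illustration.
   ````
   // Sketch only: the shape is inferred from the VerifyError above, not taken from Iceberg.
   import org.apache.hadoop.hive.metastore.IMetaStoreClient;
   import org.apache.hadoop.hive.metastore.api.Database;
   import org.apache.hadoop.hive.metastore.api.NoSuchObjectException;
   import org.apache.thrift.TException;

   class AlterDatabaseSketch {
     void alterDatabase(IMetaStoreClient client, String name, Database db) {
       try {
         // Metastore call guarded by the handlers below.
         client.alterDatabase(name, db);
       } catch (NoSuchObjectException e) {
         // In the real HiveCatalog class the stack map at the corresponding handler records
         // org.apache.thrift.TException (see the trace above), so the verifier has to prove
         // that the NoSuchObjectException it loads at runtime is assignable to the TException
         // it loads at runtime. With Hive/Thrift classes coming from mismatched jars that
         // proof can fail, and the whole class is rejected before any of it runs.
         throw new IllegalStateException("Database does not exist: " + name, e);
       } catch (TException e) {
         throw new RuntimeException("Failed to alter database " + name, e);
       }
     }
   }
   ````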
   
   #2057 does not solve this problem.
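
   Not part of the original report, but one hypothetical way to confirm a conflict of this
   kind is to print which jar each of the two classes named in the VerifyError is actually
   loaded from, running against the same jars the SQL client puts on its classpath (the
   class name below is made up):
   ````
   // Hypothetical diagnostic: prints the jar that provides each class named in the error.
   public class WhichJar {
     public static void main(String[] args) throws Exception {
       String[] names = {
           "org.apache.thrift.TException",
           "org.apache.hadoop.hive.metastore.api.NoSuchObjectException"
       };
       for (String name : names) {
         Class<?> clazz = Class.forName(name);
         java.security.CodeSource src = clazz.getProtectionDomain().getCodeSource();
         System.out.println(name + " -> "
             + (src == null ? "unknown (bootstrap?)" : src.getLocation()));
       }
     }
   }
   ````
   If the two classes resolve to different jars (for example, one from
   flink-sql-connector-hive-2.2.0_2.11-1.11.3.jar and one from the cluster's Hive jars), a
   VerifyError like the one above is the expected symptom.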
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org
For additional commands, e-mail: issues-help@iceberg.apache.org