Posted to issues@iceberg.apache.org by GitBox <gi...@apache.org> on 2022/01/19 07:18:03 UTC

[GitHub] [iceberg] George-zqq opened a new issue #3922: Required context of factory 'org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory' must not be null.

George-zqq opened a new issue #3922:
URL: https://github.com/apache/iceberg/issues/3922


   When I run bin/sql-client.sh, it throws the exception below:
   
   Setting HBASE_CONF_DIR=/etc/hbase/conf because no HBASE_CONF_DIR was set.
   SLF4J: Class path contains multiple SLF4J bindings.
   SLF4J: Found binding in [jar:file:/opt/flink-1.13.5/lib/log4j-slf4j-impl-2.16.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
   SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.1-1.cdh6.3.1.p0.1470567/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
   SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
   No default environment specified.
   Searching for '/opt/flink-1.13.5/conf/sql-client-defaults.yaml'...found.
   Reading default environment from: file:/opt/flink-1.13.5/conf/sql-client-defaults.yaml
   
   
   Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
   	at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:201)
   	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161)
   Caused by: org.apache.flink.table.api.TableException: Required context of factory 'org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory' must not be null.
   	at org.apache.flink.table.factories.TableFactoryService.normalizeContext(TableFactoryService.java:340)
   	at org.apache.flink.table.factories.TableFactoryService.filterByContext(TableFactoryService.java:235)
   	at org.apache.flink.table.factories.TableFactoryService.filter(TableFactoryService.java:178)
   	at org.apache.flink.table.factories.TableFactoryService.findSingleInternal(TableFactoryService.java:139)
   	at org.apache.flink.table.factories.TableFactoryService.find(TableFactoryService.java:108)
   	at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:243)
   	at org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.createCatalog(LegacyTableEnvironmentInitializer.java:217)
   	at org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.lambda$initializeCatalogs$1(LegacyTableEnvironmentInitializer.java:120)
   	at java.util.HashMap.forEach(HashMap.java:1289)
   	at org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.initializeCatalogs(LegacyTableEnvironmentInitializer.java:117)
   	at org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.initializeSessionState(LegacyTableEnvironmentInitializer.java:105)
   	at org.apache.flink.table.client.gateway.context.SessionContext.create(SessionContext.java:233)
   	at org.apache.flink.table.client.gateway.local.LocalContextUtils.buildSessionContext(LocalContextUtils.java:100)
   	at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:91)
   	at org.apache.flink.table.client.SqlClient.start(SqlClient.java:88)
   	at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187)
   	... 1 more
   
   
   
   _______________________
   - Hive Version:  hive-2.1.1-CDH6.3.1
   - Flink Version:  1.13.5
   - Flink SQL conf sql-client-defaults.yaml:
    execution:
       type: streaming
       current-catalog: myhive  # set the HiveCatalog as the current catalog of the session
       current-database: mydatabase
    catalogs:
       name: myhive
       type: hive
       hive-conf-dir: /etc/hive/conf  # contains hive-site.xml
       hive-version: 2.1.1
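
   For comparison, the Flink SQL client documentation shows the catalogs section of sql-client-defaults.yaml as a YAML list, with each catalog entry starting with a dash. A minimal sketch reusing the same values as above:

      execution:
        type: streaming
        current-catalog: myhive        # set the HiveCatalog as the current catalog of the session
        current-database: mydatabase

      catalogs:
        - name: myhive                 # each catalog is a list entry; note the leading dash
          type: hive
          hive-conf-dir: /etc/hive/conf  # directory containing hive-site.xml
          hive-version: 2.1.1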
   
   - Flink libs in $FLINK_HOME/lib:
   
   antlr-runtime-3.5.2.jar
   flink-annotations-1.13.5.jar
   flink-connector-flume_2.10-1.0.jar
   flink-connector-hive_2.11-1.13.5.jar
   flink-connector-jdbc_2.11-1.13.5.jar
   flink-connector-kafka_2.11-1.13.5.jar
   flink-csv-1.13.5.jar
   flink-dist_2.11-1.13.5.jar
   flink-format-changelog-json-2.1.1.jar
   flink-json-1.13.5.jar
   flink-python_2.11-1.13.5.jar
   flink-shaded-zookeeper-3-3.4.14-14.0.jar
   flink-shaded-zookeeper-3.4.14.jar
   flink-sql-avro-1.13.5.jar
   flink-sql-avro-confluent-registry-1.13.5.jar
   flink-sql-connector-hbase-2.2_2.11-1.13.5.jar
   flink-sql-connector-hive-2.2.0_2.11-1.12.3.jar
   flink-sql-connector-kafka_2.11-1.13.5.jar
   flink-streaming-scala_2.11-1.13.5.jar
   flink-table_2.11-1.13.5.jar
   flink-table-api-java-bridge_2.11-1.13.5.jar
   flink-table-blink_2.11-1.13.5.jar
   hive-exec-2.1.1-cdh6.3.1.jar
   log4j-1.2-api-2.16.0.jar
   log4j-api-2.16.0.jar
   log4j-core-2.16.0.jar
   log4j-slf4j-impl-2.16.0.jar
   ____________
   
   I just want to run a Flink SQL demo with a Hive catalog, but it throws the exception above.
   How do I fix it?
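
   For reference, a Hive catalog can also be registered at runtime from the SQL client prompt instead of through sql-client-defaults.yaml; a minimal sketch, assuming the same hive-conf-dir and hive-version as above:

      -- register and switch to the Hive catalog from the SQL client prompt
      CREATE CATALOG myhive WITH (
        'type' = 'hive',
        'hive-conf-dir' = '/etc/hive/conf',
        'hive-version' = '2.1.1'
      );
      USE CATALOG myhive;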
   
   
   




[GitHub] [iceberg] George-zqq closed issue #3922: Required context of factory 'org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory' must not be null.

Posted by GitBox <gi...@apache.org>.
George-zqq closed issue #3922:
URL: https://github.com/apache/iceberg/issues/3922


   

