Posted to user-zh@flink.apache.org by nashcen <24...@qq.com> on 2020/09/24 03:00:42 UTC

Flink 1.11 sql-client YAML configuration issue

I am trying to start the Flink SQL client via the command-line tool $FLINK_HOME/bin/sql-client.sh embedded and use it to connect to Hive.

In the Flink SQL configuration file sql-client-defaults.yaml, I added the following parameters:
catalogs:
  - name: myhive
    type: hive
    hive-conf-dir: /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hive/conf
    default-database: dc_stg

The client fails at startup with the following error:

Reading default environment from: file:/bigdata/athub/app/bigdata/flink/flink-1.11.1/conf/sql-client-defaults.yaml
No session environment specified.


Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
	at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:870)
	at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
	at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.CatalogFactory' in the classpath.

Reason: Required context properties mismatch.

The following properties are requested:
default-database=dc_stg
hive-conf-dir=/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hive/conf
type=hive

The following factories have been considered:
org.apache.flink.table.catalog.GenericInMemoryCatalogFactory
	at org.apache.flink.table.factories.TableFactoryService.filterByContext(TableFactoryService.java:322)
	at org.apache.flink.table.factories.TableFactoryService.filter(TableFactoryService.java:190)
	at org.apache.flink.table.factories.TableFactoryService.findSingleInternal(TableFactoryService.java:143)
	at org.apache.flink.table.factories.TableFactoryService.find(TableFactoryService.java:113)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:377)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:626)
	at java.util.HashMap.forEach(HashMap.java:1289)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
	at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
	... 3 more



--
Sent from: http://apache-flink.147419.n8.nabble.com/

Re: Flink 1.11 sql-client YAML configuration issue

Posted by Rui Li <li...@gmail.com>.
Hi, this looks like a missing Hive connector dependency. Which jars have you added under lib?
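
For reference, a rough sketch of what the Hive integration in Flink 1.11 typically expects under $FLINK_HOME/lib. The exact jar names below are examples, not taken from your setup; match them to your Scala, Flink, and Hive versions (CDH 6.3.2 ships Hive 2.1.1):

    # Option 1: the bundled SQL connector matching your Hive version
    # (version numbers here are examples, adjust to your environment)
    flink-sql-connector-hive-2.2.0_2.11-1.11.1.jar

    # Option 2: the plain Hive connector plus the hive-exec jar from your Hive installation
    flink-connector-hive_2.11-1.11.1.jar
    hive-exec-2.1.1.jar

After adding the jars, restart the SQL client; once a CatalogFactory for type: hive is on the classpath, the myhive catalog from sql-client-defaults.yaml should load and can be checked with SHOW CATALOGS;.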


-- 
Best regards!
Rui Li