Posted to user-zh@flink.apache.org by Zhou Zach <wa...@163.com> on 2020/07/14 11:52:10 UTC

flink1.11 sink hive error

Hi all,
Flink 1.11 SQL sink into a Hive table fails with the following error:


java.util.concurrent.CompletionException: org.apache.flink.client.deployment.application.ApplicationExecutionException: Could not execute application.
	at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:292) ~[?:1.8.0_161]
	at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:308) ~[?:1.8.0_161]
	at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:943) ~[?:1.8.0_161]
	at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:926) ~[?:1.8.0_161]
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474) ~[?:1.8.0_161]
	at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977) ~[?:1.8.0_161]
	at org.apache.flink.client.deployment.application.ApplicationDispatcherBootstrap.runApplicationEntryPoint(ApplicationDispatcherBootstrap.java:245) ~[flink-clients_2.11-1.11.0.jar:1.11.0]
	at org.apache.flink.client.deployment.application.ApplicationDispatcherBootstrap.lambda$runApplicationAsync$1(ApplicationDispatcherBootstrap.java:199) ~[flink-clients_2.11-1.11.0.jar:1.11.0]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_161]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_161]
	at org.apache.flink.runtime.concurrent.akka.ActorSystemScheduledExecutorAdapter$ScheduledFutureTask.run(ActorSystemScheduledExecutorAdapter.java:154) [data-flow-1.0.jar:?]
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40) [qile-data-flow-1.0.jar:?]
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44) [data-flow-1.0.jar:?]
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [qile-data-flow-1.0.jar:?]
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [data-flow-1.0.jar:?]
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [data-flow-1.0.jar:?]
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) [data-flow-1.0.jar:?]
Caused by: org.apache.flink.client.deployment.application.ApplicationExecutionException: Could not execute application.
	... 11 more
Caused by: org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Embedded metastore is not allowed. Make sure you have set a valid value for hive.metastore.uris
	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:302) ~[flink-clients_2.11-1.11.0.jar:1.11.0]
	at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198) ~[flink-clients_2.11-1.11.0.jar:1.11.0]
	at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149) ~[flink-clients_2.11-1.11.0.jar:1.11.0]
	at org.apache.flink.client.deployment.application.ApplicationDispatcherBootstrap.runApplicationEntryPoint(ApplicationDispatcherBootstrap.java:230) ~[flink-clients_2.11-1.11.0.jar:1.11.0]
	... 10 more
Caused by: java.lang.IllegalArgumentException: Embedded metastore is not allowed. Make sure you have set a valid value for hive.metastore.uris
	at org.apache.flink.util.Preconditions.checkArgument(Preconditions.java:139) ~[data-flow-1.0.jar:?]
	at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:171) ~[flink-sql-connector-hive-2.2.0_2.11-1.11.0.jar:1.11.0]
	at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:157) ~[flink-sql-connector-hive-2.2.0_2.11-1.11.0.jar:1.11.0]
	at cn.ibobei.qile.dataflow.sql.FromKafkaSinkHiveAndHbase$.main(FromKafkaSinkHiveAndHbase.scala:27) ~[data-flow-1.0.jar:?]
	at cn.ibobei.qile.dataflow.sql.FromKafkaSinkHiveAndHbase.main(FromKafkaSinkHiveAndHbase.scala) ~[data-flow-1.0.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_161]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_161]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_161]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_161]
	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-clients_2.11-1.11.0.jar:1.11.0]
	at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198) ~[flink-clients_2.11-1.11.0.jar:1.11.0]
	at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149) ~[flink-clients_2.11-1.11.0.jar:1.11.0]
	at org.apache.flink.client.deployment.application.ApplicationDispatcherBootstrap.runApplicationEntryPoint(ApplicationDispatcherBootstrap.java:230) ~[flink-clients_2.11-1.11.0.jar:1.11.0]
	... 10 more

Re:Re: flink1.11 sink hive error

Posted by Zhou Zach <wa...@163.com>.
Hi,


I just dropped the Hive table that Flink sinks to and deleted its HDFS directory, and also cleared the data in the HBase table (HBase is queried through Hue as a Hive table). After restarting the job, it works now.
If the problem shows up again, I will try your approach. Thanks for the help!

On 2020-07-14 20:42:16, "Leonard Xu" <xb...@gmail.com> wrote:
>Hi,
>After you have installed the Hive metastore, add a property like this to your hivehome/conf/hive-site.xml file:
>  <property>
>    <name>hive.metastore.uris</name>
>    <value>thrift://xxxx:9083</value>
>    <description>Thrift URI for the remote metastore. Used by metastore client to connect to remote metastore.</description>
>  </property>
>Production environments are generally configured this way as well.
>Then configure Flink to connect to Hive as described in [1]. It should work the same as what you used before; the only difference is that an embedded metastore is not supported.
>
>Best regards,
>Leonard Xu
>[1] https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#connecting-to-hive <https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#connecting-to-hive>
>
>> On Jul 14, 2020, at 20:29, Zhou Zach <wa...@163.com> wrote:
>> 
>> Hi,
>> 
>> 
>> Should hive.metastore.uris be configured in Flink's conf file?
>> 
>> On 2020-07-14 20:03:11, "Leonard Xu" <xb...@gmail.com> wrote:
>>> Hello
>>> 
>>> 
>>>> On Jul 14, 2020, at 19:52, Zhou Zach <wa...@163.com> wrote:
>>>> 
>>>> : Embedded metastore is not allowed.
>>> 
>>> When Flink integrates with Hive, an embedded metastore is not supported. You need to start a Hive metastore and configure hive.metastore.uris in the conf file; see [1] for the supported metastore versions.
>>> 
>>> Best,
>>> Leonard Xu
>>> [1] https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#using-bundled-hive-jar <https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#using-bundled-hive-jar>
>

Re: flink1.11 sink hive error

Posted by Leonard Xu <xb...@gmail.com>.
Hi,
After you have installed the Hive metastore, add a property like this to your hivehome/conf/hive-site.xml file:
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://xxxx:9083</value>
    <description>Thrift URI for the remote metastore. Used by metastore client to connect to remote metastore.</description>
  </property>
Production environments are generally configured this way as well.
Then configure Flink to connect to Hive as described in [1]. It should work the same as what you used before; the only difference is that an embedded metastore is not supported.

Best regards,
Leonard Xu
[1] https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#connecting-to-hive <https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#connecting-to-hive>
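
As a minimal sketch of the "connecting to Hive" step in [1]: create a HiveCatalog that points at the directory containing the hive-site.xml shown above and register it in the Table API. The catalog name "myhive", default database "default", the conf directory "/etc/hive/conf", and the Hive version "2.2.0" (chosen to match the flink-sql-connector-hive-2.2.0 jar in the stack trace) are assumptions; substitute your own values.

    import org.apache.flink.table.api.{EnvironmentSettings, TableEnvironment}
    import org.apache.flink.table.catalog.hive.HiveCatalog

    object RegisterHiveCatalog {
      def main(args: Array[String]): Unit = {
        val settings = EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build()
        val tableEnv = TableEnvironment.create(settings)

        // Directory that contains the hive-site.xml with hive.metastore.uris set
        // to thrift://xxxx:9083 (hypothetical path, adjust to your environment).
        val hiveConfDir = "/etc/hive/conf"

        // The constructor fails with "Embedded metastore is not allowed"
        // when hive.metastore.uris is missing from that hive-site.xml.
        val hiveCatalog = new HiveCatalog("myhive", "default", hiveConfDir, "2.2.0")

        tableEnv.registerCatalog("myhive", hiveCatalog)
        tableEnv.useCatalog("myhive")
      }
    }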

> On Jul 14, 2020, at 20:29, Zhou Zach <wa...@163.com> wrote:
> 
> Hi,
> 
> 
> Should hive.metastore.uris be configured in Flink's conf file?
> 
> On 2020-07-14 20:03:11, "Leonard Xu" <xb...@gmail.com> wrote:
>> Hello
>> 
>> 
>>> On Jul 14, 2020, at 19:52, Zhou Zach <wa...@163.com> wrote:
>>> 
>>> : Embedded metastore is not allowed.
>> 
>> When Flink integrates with Hive, an embedded metastore is not supported. You need to start a Hive metastore and configure hive.metastore.uris in the conf file; see [1] for the supported metastore versions.
>> 
>> Best,
>> Leonard Xu
>> [1] https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#using-bundled-hive-jar <https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#using-bundled-hive-jar>


Re:Re: flink1.11 sink hive error

Posted by Zhou Zach <wa...@163.com>.
Hi,


Should hive.metastore.uris be configured in Flink's conf file?

On 2020-07-14 20:03:11, "Leonard Xu" <xb...@gmail.com> wrote:
>Hello
>
>
>> On Jul 14, 2020, at 19:52, Zhou Zach <wa...@163.com> wrote:
>> 
>> : Embedded metastore is not allowed.
>
>When Flink integrates with Hive, an embedded metastore is not supported. You need to start a Hive metastore and configure hive.metastore.uris in the conf file; see [1] for the supported metastore versions.
>
>Best,
>Leonard Xu
>[1] https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#using-bundled-hive-jar <https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#using-bundled-hive-jar>

Re: flink1.11 sink hive error

Posted by Leonard Xu <xb...@gmail.com>.
Hello


> On Jul 14, 2020, at 19:52, Zhou Zach <wa...@163.com> wrote:
> 
> : Embedded metastore is not allowed.

When Flink integrates with Hive, an embedded metastore is not supported. You need to start a Hive metastore and configure hive.metastore.uris in the conf file; see [1] for the supported metastore versions.

Best,
Leonard Xu
[1] https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#using-bundled-hive-jar <https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#using-bundled-hive-jar>
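
Since the failure in the stack trace comes from the precondition check in HiveCatalog's constructor, a small sanity check before building the catalog can surface the problem with a clearer message. This is only an illustrative sketch, assuming hive-site.xml lives in a hypothetical /etc/hive/conf directory and that HiveConf is available on the classpath (e.g. from the bundled flink-sql-connector-hive jar):

    import java.io.File
    import org.apache.hadoop.hive.conf.HiveConf

    object CheckMetastoreUris {
      def main(args: Array[String]): Unit = {
        // Hypothetical location of hive-site.xml; adjust to your environment.
        val hiveConfDir = "/etc/hive/conf"

        val hiveConf = new HiveConf()
        hiveConf.addResource(new File(hiveConfDir, "hive-site.xml").toURI.toURL)

        // Flink's HiveCatalog rejects an empty hive.metastore.uris (an "embedded metastore").
        val uris = hiveConf.getVar(HiveConf.ConfVars.METASTOREURIS)
        require(uris != null && uris.trim.nonEmpty,
          s"hive.metastore.uris is not set in $hiveConfDir/hive-site.xml")
        println(s"hive.metastore.uris = $uris")
      }
    }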