Posted to user-zh@flink.apache.org by Z-Z <zz...@qq.com> on 2020/07/08 02:59:20 UTC

Flink Hadoop依赖

Hi everyone, I have a question. In Flink 1.10.0, I added flink-shaded-hadoop-2-uber-2.7.5-10.0.jar to the jobmanager's lib folder. Jobs uploaded through the web UI run fine, but when I submit a job through the CLI it fails with: "Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded." Does anyone know what is going on?

Re: Flink Hadoop依赖

Posted by Xintong Song <to...@gmail.com>.
Which machine does "the jobmanager's lib folder" refer to? How is Flink deployed, and where is the CLI running?

Thank you~

Xintong Song
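A common cause of this symptom (a sketch, not confirmed as the poster's actual setup): the CLI client resolves the 'hdfs' scheme in its own JVM, so the Hadoop jar must be visible on the machine running `flink run`, not only in the jobmanager's lib folder. Assuming a local Hadoop installation exists on the client machine, one typical remedy is:

```shell
# Assumption: a Hadoop installation is available on the CLI machine.
# Make Hadoop classes visible to the Flink client before submitting;
# `hadoop classpath` prints the installation's full classpath.
export HADOOP_CLASSPATH=$(hadoop classpath)

# Alternatively (assumption: no local Hadoop install), copy the same
# shaded jar into the lib/ folder of the Flink distribution that the
# CLI runs from, e.g.:
#   cp flink-shaded-hadoop-2-uber-2.7.5-10.0.jar $FLINK_HOME/lib/

# Then submit as usual; the hypothetical job jar path is illustrative.
$FLINK_HOME/bin/flink run ./my-job.jar
```

Either approach puts a Hadoop file system implementation on the client's classpath, which is what the "no Hadoop file system to support this scheme could be loaded" message indicates is missing.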


