Posted to user-zh@flink.apache.org by "casel.chen" <ca...@126.com> on 2021/06/19 05:35:49 UTC

flink hudi writing to OSS fails with No FileSystem for scheme "oss"

hadoop 2.9.2, flink 1.12.2, hudi 0.9.0-SNAPSHOT
The OSS settings are configured in core-site.xml; with a locally started Flink cluster, creating the table, writing data, and querying through the Flink SQL client all work fine.
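For reference, the OSS-related entries in core-site.xml are just the standard hadoop-aliyun connector properties. A minimal sketch of the same settings through the Hadoop Configuration API (the endpoint and credentials below are placeholders, not the real values):

import org.apache.hadoop.conf.Configuration;

public class OssConfSketch {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Standard hadoop-aliyun OSS properties; all values here are placeholders.
        conf.set("fs.oss.impl", "org.apache.hadoop.fs.aliyun.oss.AliyunOSSFileSystem");
        conf.set("fs.oss.endpoint", "oss-cn-hangzhou.aliyuncs.com");
        conf.set("fs.oss.accessKeyId", "<access-key-id>");
        conf.set("fs.oss.accessKeySecret", "<access-key-secret>");
        System.out.println(conf.get("fs.oss.impl"));
    }
}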


After turning this into a Flink SQL job inside my own project, packaged as a fat jar and run in local mode (the project declares flink-oss-fs-hadoop as a dependency), the program fails with the following error:
Caused by: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "oss"
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3281)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3301)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
at org.apache.hudi.common.fs.FSUtils.getFs(FSUtils.java:97)
... 41 more
The dependency tree does contain the flink-oss-fs-hadoop jar, and unpacking the fat jar also shows the shaded class org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.aliyun.oss.AliyunOSSFileSystem.
However, searching the running process with arthas does not find this class.
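To narrow down whether this is a Hadoop-level registration issue, a small standalone check can be run on the same classpath. It goes through the same FileSystem.get(...) path that fails in the Hudi stack trace above (the bucket name is a placeholder):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class OssSchemeCheck {
    public static void main(String[] args) throws Exception {
        // new Configuration() also picks up a core-site.xml found on the classpath.
        Configuration conf = new Configuration();
        // Mapping the scheme explicitly instead of relying on service discovery:
        // conf.set("fs.oss.impl", "org.apache.hadoop.fs.aliyun.oss.AliyunOSSFileSystem");
        FileSystem fs = FileSystem.get(URI.create("oss://some-bucket/"), conf);
        System.out.println(fs.getClass().getName());
    }
}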


The launch command used to run the Flink SQL job from the fat jar is:
/Users/admin/.sdkman/candidates/java/current/bin/java -cp /Users/admin/gitrepo/rtdp/streamsql/dist/lib/streamsql-submit-jar-with-dependencies.jar:/Users/admin/dev/hudi-demo/flink-1.12.2/lib/flink-shaded-zookeeper-3.4.14.jar:/Users/admin/dev/hudi-demo/flink-1.12.2/lib/flink-oss-fs-hadoop-1.12.2.jar:/Users/admin/dev/hudi-demo/flink-1.12.2/lib/flink-csv-1.12.2.jar:/Users/admin/dev/hudi-demo/flink-1.12.2/lib/flink-table_2.11-1.12.2.jar:/Users/admin/dev/hudi-demo/flink-1.12.2/lib/flink-json-1.12.2.jar:/Users/admin/dev/hudi-demo/flink-1.12.2/lib/flink-dist_2.11-1.12.2.jar:/Users/admin/dev/hudi-demo/flink-1.12.2/lib/hudi-flink-bundle_2.11-0.9.0-SNAPSHOT.jar:/Users/admin/dev/hudi-demo/flink-1.12.2/lib/flink-table-blink_2.11-1.12.2.jar com.huifu.streamsql.launcher.SubmitJobMain -command start -jobPath /Users/admin/gitrepo/rtdp/streamsql/dist/jobs/test/hudi/1-hudi-insert
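Since the flink-oss-fs-hadoop jar is placed directly on -cp here rather than loaded through Flink's plugins directory, one thing worth checking is which Flink filesystem factories the application classloader can actually see. A rough diagnostic sketch (a simplification, not the exact discovery code Flink runs internally):

import java.util.ServiceLoader;
import org.apache.flink.core.fs.FileSystemFactory;

public class ListFsFactories {
    public static void main(String[] args) {
        // Lists every FileSystemFactory visible via ServiceLoader; an "oss" entry
        // would indicate the connector is discoverable on this classpath.
        for (FileSystemFactory factory : ServiceLoader.load(FileSystemFactory.class)) {
            System.out.println(factory.getScheme() + " -> " + factory.getClass().getName());
        }
    }
}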


Looking up classes with arthas:


[INFO] arthas home: /Users/admin/.arthas/lib/3.5.1/arthas
[INFO] Try to attach process 25267
[INFO] Attach process 25267 success.
[INFO] arthas-client connect 127.0.0.1 3658
  ,---.  ,------. ,--------.,--.  ,--.  ,---.   ,---.                           
 /  O  \ |  .--. ''--.  .--'|  '--'  | /  O  \ '   .-'                          
|  .-.  ||  '--'.'   |  |   |  .--.  ||  .-.  |`.  `-.                          
|  | |  ||  |\  \    |  |   |  |  |  ||  | |  |.-'    |                         
`--' `--'`--' '--'   `--'   `--'  `--'`--' `--'`-----'                          
                                                                                


wiki       https://arthas.aliyun.com/doc                                        
tutorials  https://arthas.aliyun.com/doc/arthas-tutorials.html                  
version    3.5.1                                                                
main_class com.huifu.streamsql.launcher.SubmitJobMain                           
pid        25267                                                                
time       2021-06-19 11:57:22                                                  


[arthas@25267]$ sc org.apache.flink.fs.*
Affect(row-cnt:0) cost in 19 ms.
[arthas@25267]$ sc org.apache.hadoop.fs.*
Affect(row-cnt:0) cost in 5 ms.


One more question: does writing to OSS with Flink + Hudi require a Hadoop home, i.e. an installed Hadoop directory? Or is it possible to just package core-site.xml into the fat jar and run with that?
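For that last point, my understanding is that a Hadoop Configuration created with defaults reads core-site.xml from the classpath, so whether a core-site.xml packaged into the fat jar is actually visible can be checked with something like:

import org.apache.hadoop.conf.Configuration;

public class CoreSiteOnClasspathCheck {
    public static void main(String[] args) {
        // Configuration() loads core-default.xml and core-site.xml from the classpath;
        // if the packaged core-site.xml is visible, this prints the configured endpoint.
        Configuration conf = new Configuration();
        System.out.println(conf.get("fs.oss.endpoint"));
    }
}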