Posted to user@flink.apache.org by saksham sapra <sa...@gmail.com> on 2020/10/07 12:06:46 UTC

Flink Kubernetes Libraries

Hi,

I have set up Flink on Kubernetes following this page:
https://ci.apache.org/projects/flink/flink-docs-release-1.11/ops/deployment/kubernetes.html
and I can reach the Flink web UI, but I need to submit a job through Postman via:
http://localhost:8001/api/v1/namespaces/default/services/flink-jobmanager:webui/proxy/#/submit
Locally I can simply drop my libraries into the lib folder; how can I add them in this setup so that the job runs properly?

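For reference, the same submission can also be scripted against the Flink REST API instead of the UI form. A rough sketch, assuming the same kubectl proxy endpoint as above (the jar path, the returned jar id, and the entry class are placeholders):

    # upload the user jar; the JSON response contains the generated jar id
    curl -X POST \
      -F "jarfile=@/path/to/my-job.jar" \
      "http://localhost:8001/api/v1/namespaces/default/services/flink-jobmanager:webui/proxy/jars/upload"

    # run the uploaded jar by its id
    curl -X POST \
      "http://localhost:8001/api/v1/namespaces/default/services/flink-jobmanager:webui/proxy/jars/<jar-id>/run?entry-class=com.example.MyJob"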

Re: Flink Kubernetes Libraries

Posted by Till Rohrmann <tr...@apache.org>.
Hi Superainbower,

could you share the complete logs with us? They show which Flink version
you are using and the classpath the JVM was started with. Have you tried
whether the same problem occurs with the latest Flink version?
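
On Kubernetes, a quick way to pull those logs is kubectl (the pod name is a placeholder; look it up with kubectl get pods):

    # the Flink version and the full JVM classpath are printed at startup,
    # near the top of the JobManager log
    kubectl logs <flink-jobmanager-pod> | head -n 100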

Cheers,
Till


Re: Flink Kubernetes Libraries

Posted by superainbower <su...@163.com>.
Hi Till,
Could you tell me how to configure HDFS as the state backend when I deploy
Flink on Kubernetes? I tried adding the following to flink-conf.yaml:


state.backend: rocksdb
state.checkpoints.dir: hdfs://slave2:8020/flink/checkpoints
state.savepoints.dir: hdfs://slave2:8020/flink/savepoints
state.backend.incremental: true


I also added flink-shaded-hadoop2-2.8.3-1.8.3.jar to /opt/flink/lib.


But it doesn't work, and I get these errors in the logs:


Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded. For a full list of supported file systems, please see https://ci.apache.org/projects/flink/flink-docs-stable/ops/filesystems/.


Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Cannot support file system for 'hdfs' via Hadoop, because Hadoop is not in the classpath, or some classes are missing from the classpath


Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.flink.runtime.util.HadoopUtils
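
One sanity check, assuming kubectl access to the cluster (the pod name is a placeholder), is to confirm that the shaded Hadoop jar actually made it into the running containers:

    # list the lib directory inside the running JobManager container;
    # flink-shaded-hadoop2-2.8.3-1.8.3.jar should show up here, and the
    # same check applies to the TaskManager pods
    kubectl exec <flink-jobmanager-pod> -- ls /opt/flink/lib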

Re: Flink Kubernetes Libraries

Posted by Till Rohrmann <tr...@apache.org>.
Hi Saksham,

if you want to extend the Flink Docker image, you can find more details
here [1].
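
As an illustration, a minimal Dockerfile in the spirit of [1] could look like this (the base image tag and the library name are placeholders):

    FROM flink:1.11.2-scala_2.11
    # everything in /opt/flink/lib (FLINK_HOME/lib) ends up on the system
    # classpath of the JobManager and TaskManager processes
    COPY my-library.jar /opt/flink/lib/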

If you want to include the library in your user jar, then you have to add
the library as a dependency to your pom.xml file and enable the shade
plugin for building an uber jar [2].
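
A minimal shade plugin setup along the lines of [2] might look like this in the pom.xml (the version and the excludes are only an example):

    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-shade-plugin</artifactId>
          <version>3.2.4</version>
          <executions>
            <execution>
              <!-- build the uber jar during mvn package -->
              <phase>package</phase>
              <goals><goal>shade</goal></goals>
              <configuration>
                <artifactSet>
                  <excludes>
                    <!-- dependencies provided by the Flink runtime should
                         stay out of the uber jar -->
                    <exclude>org.apache.flink:force-shading</exclude>
                  </excludes>
                </artifactSet>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>

Running mvn package then produces the uber jar under target/, which is what gets submitted to the cluster.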

[1]
https://ci.apache.org/projects/flink/flink-docs-stable/ops/deployment/docker.html#advanced-customization
[2]
https://maven.apache.org/plugins/maven-shade-plugin/examples/includes-excludes.html

Cheers,
Till

On Fri, Oct 9, 2020 at 3:22 PM saksham sapra <sa...@gmail.com>
wrote:

> Thanks, Till, for helping out.
>
> Following your suggestion, is it possible to copy the libraries from a D:\
> directory into FLINK_HOME/lib? I ran a copy command (copy D:/data/libs to
> FLINK_HOME/libs) and the files get copied, but I don't know how to check
> where they end up and whether Flink actually picks them up.
>
>
> Thanks,
> Saksham Sapra
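
For the record, a sketch of that copy step done against a running pod (pod name and namespace are placeholders; the classpath is only scanned when the JVM starts, so the containers need a restart before the jar is picked up):

    # copy a local library into the JobManager pod and verify it arrived
    kubectl cp D:/data/libs/my-lib.jar default/<flink-jobmanager-pod>:/opt/flink/lib/my-lib.jar
    kubectl exec <flink-jobmanager-pod> -- ls /opt/flink/lib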

Re: Flink Kubernetes Libraries

Posted by Till Rohrmann <tr...@apache.org>.
Hi Saksham,

the easiest approach would probably be to include the required libraries in
your user code jar which you submit to the cluster. Using Maven's shade
plugin should help with this task. Alternatively, you could also create a
custom Flink Docker image where you add the required libraries to the
FLINK_HOME/lib directory. This would, however, mean that every job you
submit to the Flink cluster would see these libraries on the system class
path.

Cheers,
Till
