Posted to issues@flink.apache.org by "Wei Zhong (Jira)" <ji...@apache.org> on 2020/04/02 09:01:00 UTC

[jira] [Created] (FLINK-16943) Support adding jars in PyFlink

Wei Zhong created FLINK-16943:
---------------------------------

             Summary: Support adding jars in PyFlink
                 Key: FLINK-16943
                 URL: https://issues.apache.org/jira/browse/FLINK-16943
             Project: Flink
          Issue Type: Improvement
          Components: API / Python
            Reporter: Wei Zhong


Since the release of Flink 1.10.0, many users have complained that loading external jar packages in PyFlink is inconvenient. For local execution, users need to copy the jar files into the lib folder under the PyFlink installation directory, which is hard to locate. For job submission, users need to merge their jars into one, because `flink run` only accepts a single jar file. That may be easy for Java users, but it is difficult for Python users who have never touched Java.

We intend to add an `add_jars` interface to the PyFlink TableEnvironment to solve this problem. It will add the jars to the context classloader of the Py4j gateway server and to the `PipelineOptions.JARS` entry of the configuration of the StreamExecutionEnvironment/ExecutionEnvironment.

Via this interface, users can add jars in their Python job. The jars will be loaded immediately, so users can use them even on the very next line of the Python code. Submitting a job with multiple external jars will no longer be a problem, because all the jars in `PipelineOptions.JARS` will be added to the JobGraph and uploaded to the cluster.
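To make the configuration half of this concrete, here is a minimal pure-Python sketch of how local jar paths could be merged into the `pipeline.jars` config entry (the key behind `PipelineOptions.JARS`, which Flink stores as a semicolon-separated list of URLs). The function name `add_jars` and the plain-dict configuration are illustrative assumptions for this sketch, not the actual PyFlink implementation:

```python
from pathlib import Path

# Config key behind PipelineOptions.JARS: a semicolon-separated list of jar URLs.
PIPELINE_JARS_KEY = "pipeline.jars"

def add_jars(config: dict, *jar_paths: str) -> None:
    """Sketch: merge local jar paths into the 'pipeline.jars' entry
    of a configuration dict as file:// URLs, deduplicating entries.
    (Illustrative only; the real interface would also register the
    jars with the Py4j gateway's context classloader.)"""
    existing = [u for u in config.get(PIPELINE_JARS_KEY, "").split(";") if u]
    for p in jar_paths:
        url = Path(p).absolute().as_uri()  # e.g. file:///tmp/udf.jar
        if url not in existing:
            existing.append(url)
    config[PIPELINE_JARS_KEY] = ";".join(existing)

config = {}
add_jars(config, "/tmp/connector.jar", "/tmp/udf.jar")
add_jars(config, "/tmp/udf.jar")  # duplicate, ignored
print(config[PIPELINE_JARS_KEY])
```

Because the merged value lives in the environment's configuration, every jar listed there travels with the JobGraph at submission time, which is what removes the need to merge jars by hand.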

As this does not seem to be a big change, I'm not sure whether it is necessary to create a FLIP to discuss it, so I have created a JIRA first for flexibility. What do you think?



--
This message was sent by Atlassian Jira
(v8.3.4#803005)