Posted to commits@flink.apache.org by hx...@apache.org on 2021/12/01 02:49:55 UTC

[flink-ml] branch master updated: [FLINK-24933][python] Support ML Python API to implement FLIP-173 and FLIP-174

This is an automated email from the ASF dual-hosted git repository.

hxb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-ml.git


The following commit(s) were added to refs/heads/master by this push:
     new 835d8d0  [FLINK-24933][python] Support ML Python API to implement FLIP-173 and FLIP-174
835d8d0 is described below

commit 835d8d0473eb8a948de748c6d22127d6461142b1
Author: huangxingbo <hx...@gmail.com>
AuthorDate: Wed Nov 17 15:58:13 2021 +0800

    [FLINK-24933][python] Support ML Python API to implement FLIP-173 and FLIP-174
    
    This closes #36.
---
 .gitignore                                         |   3 +
 flink-ml-python/MANIFEST.in                        |  21 ++
 flink-ml-python/README.md                          |  20 ++
 flink-ml-python/dev/dev-requirements.txt           |  20 ++
 flink-ml-python/pyflink/examples/ml/__init__.py    |  17 ++
 flink-ml-python/pyflink/ml/__init__.py             |  17 ++
 flink-ml-python/pyflink/ml/core/__init__.py        |  17 ++
 flink-ml-python/pyflink/ml/core/api.py             | 115 +++++++
 flink-ml-python/pyflink/ml/core/builder.py         | 131 ++++++++
 flink-ml-python/pyflink/ml/core/param.py           | 338 +++++++++++++++++++++
 flink-ml-python/pyflink/ml/lib/__init__.py         |  17 ++
 flink-ml-python/pyflink/ml/tests/__init__.py       |  25 ++
 flink-ml-python/pyflink/ml/tests/test_pipeline.py  |  93 ++++++
 flink-ml-python/pyflink/ml/tests/test_stage.py     | 200 ++++++++++++
 flink-ml-python/pyflink/ml/tests/test_utils.py     |  34 +++
 flink-ml-python/pyflink/ml/util/__init__.py        |  17 ++
 .../pyflink/ml/util/read_write_utils.py            | 160 ++++++++++
 flink-ml-python/pyflink/ml/version.py              |  23 ++
 flink-ml-python/setup.cfg                          |  19 ++
 flink-ml-python/setup.py                           | 120 ++++++++
 20 files changed, 1407 insertions(+)

diff --git a/.gitignore b/.gitignore
index afd1e95..69b32b4 100644
--- a/.gitignore
+++ b/.gitignore
@@ -16,3 +16,6 @@ target
 .DS_Store
 *.ipr
 *.iws
+*.pyc
+flink-ml-python/dist/
+flink-ml-python/apache_flink_ml.egg-info/
diff --git a/flink-ml-python/MANIFEST.in b/flink-ml-python/MANIFEST.in
new file mode 100644
index 0000000..2635bfb
--- /dev/null
+++ b/flink-ml-python/MANIFEST.in
@@ -0,0 +1,21 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+global-exclude *.py[cod] __pycache__ .DS_Store
+recursive-include deps/examples *.py
+include README.md
diff --git a/flink-ml-python/README.md b/flink-ml-python/README.md
new file mode 100644
index 0000000..a68c576
--- /dev/null
+++ b/flink-ml-python/README.md
@@ -0,0 +1,20 @@
+Flink ML is a library that provides machine learning (ML) APIs and libraries to simplify the building of ML pipelines. It provides a set of standard ML APIs for MLlib developers to implement ML algorithms, as well as libraries of ML algorithms that can be used to build ML pipelines for both training and inference jobs.
+
+Flink ML is developed under the umbrella of [Apache Flink](https://flink.apache.org/).
+
+## <a name="build"></a>Python Packaging
+
+Prerequisites for building apache-flink-ml:
+
+* Unix-like environment (we use Linux and Mac OS X)
+* Python 3.6, 3.7 or 3.8
+
+Then go to the root directory of the flink-ml-python source code and run the following command to build the sdist package of apache-flink-ml:
+```bash
+cd flink-ml-python; python setup.py sdist;
+```
+
+The sdist package of apache-flink-ml can then be found under ./flink-ml-python/dist/ and installed with pip, for example:
+```bash
+python -m pip install dist/*.tar.gz
+```
diff --git a/flink-ml-python/dev/dev-requirements.txt b/flink-ml-python/dev/dev-requirements.txt
new file mode 100755
index 0000000..464b314
--- /dev/null
+++ b/flink-ml-python/dev/dev-requirements.txt
@@ -0,0 +1,20 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+setuptools>=18.0
+wheel
+apache-flink==1.14.0
+pandas>=1.0,<1.2.0
+jsonpickle==2.0.0
+cloudpickle==1.2.2
\ No newline at end of file
diff --git a/flink-ml-python/pyflink/examples/ml/__init__.py b/flink-ml-python/pyflink/examples/ml/__init__.py
new file mode 100644
index 0000000..65b48d4
--- /dev/null
+++ b/flink-ml-python/pyflink/examples/ml/__init__.py
@@ -0,0 +1,17 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
diff --git a/flink-ml-python/pyflink/ml/__init__.py b/flink-ml-python/pyflink/ml/__init__.py
new file mode 100644
index 0000000..65b48d4
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/__init__.py
@@ -0,0 +1,17 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
diff --git a/flink-ml-python/pyflink/ml/core/__init__.py b/flink-ml-python/pyflink/ml/core/__init__.py
new file mode 100644
index 0000000..65b48d4
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/core/__init__.py
@@ -0,0 +1,17 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
diff --git a/flink-ml-python/pyflink/ml/core/api.py b/flink-ml-python/pyflink/ml/core/api.py
new file mode 100644
index 0000000..541dffc
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/core/api.py
@@ -0,0 +1,115 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+from abc import ABC, abstractmethod
+from typing import TypeVar, List, Generic
+
+from pyflink.datastream import StreamExecutionEnvironment
+from pyflink.table import Table
+
+from pyflink.ml.core.param import WithParams
+
+T = TypeVar('T')
+E = TypeVar('E')
+M = TypeVar('M')
+
+
+class Stage(WithParams[T], ABC):
+    """
+    Base class for a node in a Pipeline or Graph. The interface is only a concept and does not
+    provide any actual functionality. Its subclasses must be Estimator, Model, Transformer or
+    AlgoOperator. No other classes should inherit this interface directly.
+
+    Each stage carries parameters, and requires a public empty constructor for restoration.
+    """
+
+    @abstractmethod
+    def save(self, path: str) -> None:
+        """
+        Saves this stage to the given path.
+        """
+        pass
+
+    @classmethod
+    @abstractmethod
+    def load(cls, env: StreamExecutionEnvironment, path: str) -> T:
+        """
+        Instantiates a new stage instance based on the data read from the given path.
+        """
+        pass
+
+
+class AlgoOperator(Stage[T], ABC):
+    """
+    An AlgoOperator takes a list of tables as inputs and produces a list of tables as results. It
+    can be used to encode generic multi-input multi-output computation logic.
+    """
+
+    @abstractmethod
+    def transform(self, *inputs: Table) -> List[Table]:
+        """
+        Applies the AlgoOperator on the given input tables and returns the result tables.
+
+        :param inputs: A list of tables.
+        :return: A list of tables.
+        """
+        pass
+
+
+class Transformer(AlgoOperator[T], ABC):
+    """
+    A Transformer is an AlgoOperator with the semantic difference that it encodes the Transformation
+    logic, such that a record in the output typically corresponds to one record in the input. In
+    contrast, an AlgoOperator is a better fit to express aggregation logic where a record in the
+    output could be computed from an arbitrary number of records in the input.
+    """
+    pass
+
+
+class Model(Transformer[T], ABC):
+    """
+    A Model is typically generated by invoking :func:`~Estimator.fit`. A Model is a Transformer with
+    the extra APIs to set and get model data.
+    """
+
+    def set_model_data(self, *inputs: Table) -> None:
+        """
+        Sets the model data with a list of tables.
+
+        :param inputs: A list of tables representing the model data.
+        """
+        raise Exception("This operation is not supported.")
+
+    def get_model_data(self) -> List[Table]:
+        """
+        Gets a list of tables representing the model data. Each table could be an unbounded stream
+        of model data changes.
+
+        :return: A list of tables.
+        """
+        raise Exception("This operation is not supported.")
+
+
+class Estimator(Generic[E, M], Stage[E], ABC):
+    """
+    Estimators are responsible for training and generating Models.
+    """
+
+    @abstractmethod
+    def fit(self, *inputs: Table) -> Model[M]:
+        """
+        Trains on the given inputs and produces a Model.
+
+        :param inputs: A list of tables.
+        :return: A Model.
+        """
+        pass
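
To make these interfaces concrete, here is a minimal sketch of a custom Model, modeled on the
Add10Model test stage added later in this commit (test_pipeline.py). The class is illustrative,
not part of the shipped library:

```python
from typing import Any, Dict, List

from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import Table

from pyflink.ml.core.api import Model
from pyflink.ml.core.param import Param


class Add10Model(Model):
    """Toy Model that adds 10 to column 'a' of its single input table."""

    def transform(self, *inputs: Table) -> List[Table]:
        assert len(inputs) == 1
        return [inputs[0].select("a + 10 as a")]

    def save(self, path: str) -> None:
        # Persists only class metadata and the (empty) parameter map.
        from pyflink.ml.util import read_write_utils
        read_write_utils.save_metadata(self, path)

    @classmethod
    def load(cls, env: StreamExecutionEnvironment, path: str) -> 'Add10Model':
        from pyflink.ml.util import read_write_utils
        return read_write_utils.load_stage_param(path)

    def get_param_map(self) -> Dict[Param, Any]:
        return {}
```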
diff --git a/flink-ml-python/pyflink/ml/core/builder.py b/flink-ml-python/pyflink/ml/core/builder.py
new file mode 100644
index 0000000..cd27358
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/core/builder.py
@@ -0,0 +1,131 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+from typing import List, TypeVar
+
+from pyflink.datastream import StreamExecutionEnvironment
+from pyflink.table import Table
+
+from pyflink.ml.core.api import Estimator, Model, AlgoOperator, Stage
+
+E = TypeVar('E')
+
+
+class PipelineModel(Model):
+    """
+    A PipelineModel acts as a Model. It consists of an ordered list of stages, each of which could
+    be a Model, Transformer or AlgoOperator.
+    """
+
+    def __init__(self, stages: List[Stage]):
+        self._stages = stages
+        self._param_map = {}
+
+    def transform(self, *inputs: Table) -> List[Table]:
+        """
+        Applies all stages in this PipelineModel on the input tables in order. The output of one
+        stage is used as the input of the next stage (if any). The output of the last stage is
+        returned as the result of this method.
+
+        :param inputs: A list of tables.
+        :return: A list of tables.
+        """
+        for stage in self._stages:
+            if isinstance(stage, AlgoOperator):
+                inputs = stage.transform(*inputs)
+            else:
+                raise TypeError(f"The stage {stage} must be an AlgoOperator.")
+        return list(inputs)
+
+    def save(self, path: str) -> None:
+        from pyflink.ml.util import read_write_utils
+        read_write_utils.save_pipeline(self, self._stages, path)
+
+    @classmethod
+    def load(cls, env: StreamExecutionEnvironment, path: str) -> 'PipelineModel':
+        from pyflink.ml.util import read_write_utils
+        return PipelineModel(read_write_utils.load_pipeline(env, path))
+
+    def get_param_map(self):
+        return self._param_map
+
+
+class Pipeline(Estimator[E, PipelineModel]):
+    """
+    A Pipeline acts as an Estimator. It consists of an ordered list of stages, each of which could
+    be an Estimator, Model, Transformer or AlgoOperator.
+    """
+
+    def __init__(self, stages: List[Stage]):
+        self._stages = stages
+        self._param_map = {}
+
+    def fit(self, *inputs: Table) -> PipelineModel:
+        """
+        Trains the pipeline to fit on the given tables.
+
+        This method goes through all stages of this pipeline in order and does the following on
+        each stage until the last Estimator (inclusive).
+
+        - If a stage is an Estimator, invoke :func:`~Estimator.fit` with the input tables to
+          generate a Model. If there is an Estimator after this stage, transform the input
+          tables using the generated Model to get result tables, then pass the result tables
+          to the next stage as inputs.
+        - If a stage is an AlgoOperator and there is an Estimator after this stage, transform
+          the input tables using this stage to get result tables, then pass the result tables
+          to the next stage as inputs.
+
+        After all the Estimators are trained to fit their input tables, a new PipelineModel will
+        be created with the same stages in this pipeline, except that all the Estimators in the
+        PipelineModel are replaced with the models generated in the above process.
+
+        :param inputs: A list of tables.
+        :return: A PipelineModel.
+        """
+        last_estimator_idx = -1
+        for i, stage in enumerate(self._stages):
+            if isinstance(stage, Estimator):
+                last_estimator_idx = i
+
+        model_stages = []
+        last_inputs = inputs
+        for i, stage in enumerate(self._stages):
+            if isinstance(stage, Estimator):
+                model_stage = stage.fit(*last_inputs)
+            else:
+                model_stage = stage
+            model_stages.append(model_stage)
+
+            if i < last_estimator_idx:
+                last_inputs = model_stage.transform(*last_inputs)
+
+        return PipelineModel(model_stages)
+
+    def save(self, path: str) -> None:
+        from pyflink.ml.util import read_write_utils
+        read_write_utils.save_pipeline(self, self._stages, path)
+
+    @classmethod
+    def load(cls, env: StreamExecutionEnvironment, path: str) -> 'Pipeline':
+        from pyflink.ml.util import read_write_utils
+        return Pipeline(read_write_utils.load_pipeline(env, path))
+
+    def get_param_map(self):
+        return self._param_map
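
As a usage sketch, reusing the illustrative Add10Model from the api.py example above and
mirroring test_pipeline.py later in this commit: fitting a Pipeline of two Model stages trains
nothing, and the resulting PipelineModel chains their transform() calls in order.

```python
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment

from pyflink.ml.core.builder import Pipeline, PipelineModel

env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)
input_table = t_env.from_elements([(1,), (2,), (3,)], ['a'])

# fit() walks the stages; with no Estimator present it simply wraps the
# two models, so transform() adds 10 twice: 'a' becomes 21, 22, 23.
model = Pipeline([Add10Model(), Add10Model()]).fit(input_table)
output_table = model.transform(input_table)[0]

# A fitted PipelineModel can be saved and reloaded (the path is hypothetical).
model.save('/tmp/pipeline_model')
loaded_model = PipelineModel.load(env, '/tmp/pipeline_model')
```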
diff --git a/flink-ml-python/pyflink/ml/core/param.py b/flink-ml-python/pyflink/ml/core/param.py
new file mode 100644
index 0000000..91b59b6
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/core/param.py
@@ -0,0 +1,338 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+from abc import ABC, abstractmethod
+from typing import TypeVar, Generic, List, Dict, Any, Optional, Tuple, Union
+
+import jsonpickle
+
+T = TypeVar('T')
+V = TypeVar('V')
+
+
+class WithParams(Generic[T], ABC):
+    """
+    Interface for classes that take parameters. It provides APIs to set and get parameters.
+    """
+
+    def set(self, param: 'Param[V]', value: V) -> 'T':
+        """
+        Sets the value of the parameter.
+
+        :param param: The parameter.
+        :param value: The parameter value.
+        :return: The WithParams instance itself.
+        """
+        if not self._is_compatible_type(param, value):
+            raise TypeError(
+                f"Parameter {param.name}'s type {param.type} is incompatible with the type of "
+                f"{type(value)}")
+
+        if not param.validator.validate(value):
+            if value is None:
+                raise ValueError(f'Parameter {param.name}\'s value should not be None.')
+            else:
+                raise ValueError(f'Parameter {param.name} is given an invalid value {value}.')
+
+        self.get_param_map()[param] = value
+        return self
+
+    def get_param(self, name: str) -> Optional['Param[V]']:
+        """
+        Gets the parameter by its name.
+
+        :param name: The parameter name.
+        :return: The parameter.
+        """
+        for item in self.get_param_map():
+            if item.name == name:
+                return item
+        return None
+
+    def get(self, param: 'Param[V]') -> V:
+        """
+        Gets the value of the parameter.
+
+        :param param: The parameter.
+        :return: The parameter value.
+        """
+        value = self.get_param_map().get(param)
+        if value is None and not param.validator.validate(None):
+            raise ValueError(f'Parameter {param.name}\'s value should not be None')
+        return value
+
+    @abstractmethod
+    def get_param_map(self) -> Dict['Param[Any]', Any]:
+        """
+        Returns a map which maps parameter definition to parameter value.
+        """
+        pass
+
+    @staticmethod
+    def _is_compatible_type(param: 'Param[V]', value: V) -> bool:
+        if value is not None and param.type != type(value):
+            return False
+        if isinstance(value, list):
+            for item in value:
+                if param.type_name != f'list[{type(item).__name__}]':
+                    return False
+            return True
+        return True
+
+
+class ParamValidator(Generic[T], ABC):
+    """
+    An interface to validate that a parameter value is valid.
+    """
+
+    @abstractmethod
+    def validate(self, value: T) -> bool:
+        """
+        Validates whether the parameter value is valid.
+
+        :param value: The parameter value.
+        :return: A boolean indicating whether the parameter value is valid.
+        """
+        pass
+
+
+class ParamValidators(object):
+    """
+    Factory methods for common parameter validators.
+    """
+
+    @staticmethod
+    def always_true() -> ParamValidator[T]:
+        class AlwaysTrue(ParamValidator[T]):
+            """
+            Always return true.
+            """
+
+            def validate(self, value: T) -> bool:
+                return True
+
+        return AlwaysTrue()
+
+    @staticmethod
+    def gt(lower_bound: float) -> ParamValidator[T]:
+        class GT(ParamValidator[T]):
+            """
+            Checks if the parameter value is greater than lower_bound.
+            """
+
+            def validate(self, value: T) -> bool:
+                return value is not None and value > lower_bound
+
+        return GT()
+
+    @staticmethod
+    def gt_eq(lower_bound: float) -> ParamValidator[T]:
+        class GtEq(ParamValidator[T]):
+            """
+            Checks if the parameter value is greater than or equal to lower_bound.
+            """
+
+            def validate(self, value: T) -> bool:
+                return value is not None and value >= lower_bound
+
+        return GtEq()
+
+    @staticmethod
+    def lt(upper_bound: float) -> ParamValidator[T]:
+        class LT(ParamValidator[T]):
+            """
+            Checks if the parameter value is less than upper_bound.
+            """
+
+            def validate(self, value: T) -> bool:
+                return value is not None and value < upper_bound
+
+        return LT()
+
+    @staticmethod
+    def lt_eq(upper_bound: float) -> ParamValidator[T]:
+        """
+         Checks if the parameter value is less than or equal to upper_bound.
+         """
+
+        class LtEq(ParamValidator[T]):
+            def validate(self, value: T) -> bool:
+                return value is not None and value <= upper_bound
+
+        return LtEq()
+
+    @staticmethod
+    def in_range(lower_bound: float, upper_bound: float, lower_inclusive: bool = True,
+                 upper_inclusive: bool = True) -> ParamValidator[T]:
+        """
+        Checks if the parameter value is in the range from lower_bound to upper_bound.
+        """
+
+        class InRange(ParamValidator[T]):
+            def validate(self, value: T) -> bool:
+                return (value is not None
+                        and lower_bound <= value <= upper_bound
+                        and (lower_inclusive or value != lower_bound)
+                        and (upper_inclusive or value != upper_bound))
+
+        return InRange()
+
+    @staticmethod
+    def in_array(allowed: Union[Tuple[T], List[T]]) -> ParamValidator[T]:
+        """
+        Checks if the parameter value is in the array of allowed values.
+        """
+
+        class InArray(ParamValidator[T]):
+            def validate(self, value: T) -> bool:
+                return value is not None and value in allowed
+
+        return InArray()
+
+    @staticmethod
+    def not_null() -> ParamValidator[T]:
+        """
+        Checks if the parameter value is not None.
+        """
+
+        class NotNull(ParamValidator[T]):
+            def validate(self, value: T) -> bool:
+                return value is not None
+
+        return NotNull()
+
+
+class Param(Generic[T]):
+    """
+    Definition of a parameter, including name, description, default value and the validator.
+    """
+
+    def __init__(self, name: str, type_type: type, type_name: str, description: str,
+                 default_value: T, validator: ParamValidator[T]):
+        self.name = name
+        self.type = type_type
+        self.type_name = type_name
+        self.description = description
+        self.default_value = default_value
+        self.validator = validator
+        if default_value is not None and not validator.validate(default_value):
+            raise ValueError(f"Parameter {name} is given an invalid value {default_value}")
+
+    @staticmethod
+    def json_encode(value: T) -> str:
+        """
+        Encodes the given object into a json-formatted string.
+
+        :param value: An object of class type T.
+        :return: A json-formatted string.
+        """
+        return str(jsonpickle.encode(value, keys=True))
+
+    @staticmethod
+    def json_decode(json: str) -> T:
+        """
+        Decodes the given string into an object of class type T.
+
+        :param json: A json-formatted string.
+        :return: An object of class type T.
+        """
+        return jsonpickle.decode(json, keys=True)
+
+    def __eq__(self, other):
+        return isinstance(other, Param) and self.name == other.name
+
+    def __hash__(self):
+        return hash(self.name)
+
+    def __str__(self):
+        return self.name
+
+
+class BooleanParam(Param[bool]):
+    """
+    Class for the boolean parameter.
+    """
+
+    def __init__(self, name: str, description: str, default_value: Optional[bool],
+                 validator: ParamValidator[bool] = ParamValidators.always_true()):
+        super(BooleanParam, self).__init__(name, bool, "bool", description, default_value,
+                                           validator)
+
+
+class IntParam(Param[int]):
+    """
+    Class for the int parameter.
+    """
+
+    def __init__(self, name: str, description: str, default_value: Optional[int],
+                 validator: ParamValidator[int] = ParamValidators.always_true()):
+        super(IntParam, self).__init__(name, int, "int", description, default_value, validator)
+
+
+class FloatParam(Param[float]):
+    """
+    Class for the float parameter.
+    """
+
+    def __init__(self, name: str, description: str, default_value: Optional[float],
+                 validator: ParamValidator[float] = ParamValidators.always_true()):
+        super(FloatParam, self).__init__(name, float, "float", description, default_value,
+                                         validator)
+
+
+class StringParam(Param[str]):
+    """
+    Class for the string parameter.
+    """
+
+    def __init__(self, name: str, description: str, default_value: Optional[str],
+                 validator: ParamValidator[str] = ParamValidators.always_true()):
+        super(StringParam, self).__init__(name, str, "str", description, default_value, validator)
+
+
+class IntArrayParam(Param[List[int]]):
+    """
+    Class for the int array parameter.
+    """
+
+    def __init__(self, name: str, description: str, default_value: Optional[List[int]],
+                 validator: ParamValidator[List[int]] = ParamValidators.always_true()):
+        super(IntArrayParam, self).__init__(name, list, "list[int]", description, default_value,
+                                            validator)
+
+
+class FloatArrayParam(Param[List[float]]):
+    """
+    Class for the float array parameter.
+    """
+
+    def __init__(self, name: str, description: str, default_value: Optional[List[float]],
+                 validator: ParamValidator[List[float]] = ParamValidators.always_true()):
+        super(FloatArrayParam, self).__init__(name, list, "list[float]", description, default_value,
+                                              validator)
+
+
+class StringArrayParam(Param[List[str]]):
+    """
+    Class for the string array parameter.
+    """
+
+    def __init__(self, name: str, description: str, default_value: Optional[List[str]],
+                 validator: ParamValidator[List[str]] = ParamValidators.always_true()):
+        super(StringArrayParam, self).__init__(name, list, "list[str]", description, default_value,
+                                               validator)
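
A short sketch of how these pieces fit together (the parameter name and holder class are
illustrative): a stage declares Param definitions, seeds its param map with the defaults, and
WithParams.set/get then enforce type compatibility and validation.

```python
from typing import Any, Dict

from pyflink.ml.core.param import IntParam, Param, ParamValidators, WithParams

# An int parameter whose value must be less than 100.
MAX_ITER = IntParam('max_iter', 'Maximum number of iterations.', 10,
                    ParamValidators.lt(100))


class MyParams(WithParams):
    def __init__(self):
        self._param_map = {MAX_ITER: MAX_ITER.default_value}

    def get_param_map(self) -> Dict[Param, Any]:
        return self._param_map


params = MyParams()
params.set(MAX_ITER, 50)
assert params.get(MAX_ITER) == 50
assert params.get_param('max_iter') is MAX_ITER
# params.set(MAX_ITER, 100)   would raise ValueError (fails lt(100))
# params.set(MAX_ITER, "50")  would raise TypeError (str is not int)
```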
diff --git a/flink-ml-python/pyflink/ml/lib/__init__.py b/flink-ml-python/pyflink/ml/lib/__init__.py
new file mode 100644
index 0000000..65b48d4
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/lib/__init__.py
@@ -0,0 +1,17 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
diff --git a/flink-ml-python/pyflink/ml/tests/__init__.py b/flink-ml-python/pyflink/ml/tests/__init__.py
new file mode 100644
index 0000000..1ffa1a2
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/tests/__init__.py
@@ -0,0 +1,25 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+import sys
+from pathlib import Path
+
+# Because this project and the dependent `pyflink` project share the same directory structure,
+# we manually add the `flink-ml-python` path to `sys.path` in the tests of this project to
+# adjust the package search order.
+flink_ml_python_dir = Path(__file__).parents[3]
+sys.path.append(str(flink_ml_python_dir))
diff --git a/flink-ml-python/pyflink/ml/tests/test_pipeline.py b/flink-ml-python/pyflink/ml/tests/test_pipeline.py
new file mode 100644
index 0000000..99cd9ff
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/tests/test_pipeline.py
@@ -0,0 +1,93 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+import os
+from typing import Dict, Any, List
+
+import pandas as pd
+from pandas._testing import assert_frame_equal
+from pyflink.datastream import StreamExecutionEnvironment
+from pyflink.table import Table
+
+from pyflink.ml.core.api import Model
+from pyflink.ml.core.builder import PipelineModel, Pipeline
+from pyflink.ml.core.param import Param
+from pyflink.ml.tests.test_utils import PyFlinkMLTestCase
+
+
+class PipelineTest(PyFlinkMLTestCase):
+
+    def test_pipeline_model(self):
+        input_table = self.t_env.from_elements([(1,), (2,), (3,)], ['a'])
+
+        model_a = Add10Model()
+        model_b = Add10Model()
+        model_c = Add10Model()
+        model = PipelineModel([model_a, model_b, model_c])
+        output_table = model.transform(input_table)[0]
+
+        assert_frame_equal(output_table.to_pandas(),
+                           pd.DataFrame([[31], [32], [33]], columns=['a']))
+
+        # Saves and loads the PipelineModel.
+        path = os.path.join(self.temp_dir, "test_pipeline_model")
+        model.save(path)
+        loaded_model = PipelineModel.load(self.env, path)
+
+        output_table2 = loaded_model.transform(input_table)[0]
+        assert_frame_equal(output_table2.to_pandas(),
+                           pd.DataFrame([[31], [32], [33]], columns=['a']))
+
+    def test_pipeline(self):
+        input_table = self.t_env.from_elements([(1,), (2,), (3,)], ['a'])
+
+        model_a = Add10Model()
+        model_b = Add10Model()
+        estimator = Pipeline([model_a, model_b])
+        model = estimator.fit(input_table)
+        output_table = model.transform(input_table)[0]
+
+        assert_frame_equal(output_table.to_pandas(),
+                           pd.DataFrame([[21], [22], [23]], columns=['a']))
+
+        # Saves and loads the PipelineModel.
+        path = os.path.join(self.temp_dir, "test_pipeline")
+        estimator.save(path)
+        loaded_estimator = Pipeline.load(self.env, path)
+
+        model = loaded_estimator.fit(input_table)
+        output_table = model.transform(input_table)[0]
+        assert_frame_equal(output_table.to_pandas(),
+                           pd.DataFrame([[21], [22], [23]], columns=['a']))
+
+
+class Add10Model(Model):
+    def transform(self, *inputs: Table) -> List[Table]:
+        assert len(inputs) == 1
+        return [inputs[0].select("a + 10 as a")]
+
+    def save(self, path: str) -> None:
+        from pyflink.ml.util import read_write_utils
+        read_write_utils.save_metadata(self, path)
+
+    @classmethod
+    def load(cls, env: StreamExecutionEnvironment, path: str):
+        from pyflink.ml.util import read_write_utils
+        return read_write_utils.load_stage_param(path)
+
+    def get_param_map(self) -> Dict['Param[Any]', Any]:
+        return {}
diff --git a/flink-ml-python/pyflink/ml/tests/test_stage.py b/flink-ml-python/pyflink/ml/tests/test_stage.py
new file mode 100644
index 0000000..addf260
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/tests/test_stage.py
@@ -0,0 +1,200 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+import os
+from typing import Dict, Any
+
+from pyflink.datastream import StreamExecutionEnvironment
+
+from pyflink.ml.core.api import Stage
+from pyflink.ml.core.param import ParamValidators, Param, BooleanParam, IntParam, \
+    FloatParam, StringParam, IntArrayParam, FloatArrayParam, StringArrayParam
+from pyflink.ml.tests.test_utils import PyFlinkMLTestCase
+
+BOOLEAN_PARAM = BooleanParam("boolean_param", "Description", False)
+INT_PARAM = IntParam("int_param", "Description", 1, ParamValidators.lt(100))
+FLOAT_PARAM = FloatParam("float_param", "Description", 3.0, ParamValidators.lt(100))
+STRING_PARAM = StringParam('string_param', "Description", "5")
+INT_ARRAY_PARAM = IntArrayParam("int_array_param", "Description", [6, 7])
+FLOAT_ARRAY_PARAM = FloatArrayParam("float_array_param", "Description", [10.0, 11.0])
+STRING_ARRAY_PARAM = StringArrayParam("string_array_param", "Description", ["14", "15"])
+EXTRA_INT_PARAM = IntParam("extra_int_param",
+                           "Description",
+                           20,
+                           ParamValidators.always_true())
+PARAM_WITH_NONE_DEFAULT = IntParam("param_with_none_default",
+                                   "Must be explicitly set with a non-none value",
+                                   None,
+                                   ParamValidators.not_null())
+
+
+class StageTest(PyFlinkMLTestCase):
+
+    def test_param_set_value_with_name(self):
+        stage = MyStage()
+        stage.set(INT_PARAM, 2)
+        self.assertEqual(2, stage.get(INT_PARAM))
+
+        param = stage.get_param("int_param")
+        stage.set(param, 3)
+        self.assertEqual(3, stage.get(param))
+
+        param = stage.get_param('extra_int_param')
+        stage.set(param, 50)
+        self.assertEqual(50, stage.get(param))
+
+    def test_param_with_null_default(self):
+        stage = MyStage()
+        import pytest
+        with pytest.raises(ValueError, match='value should not be None'):
+            stage.get(PARAM_WITH_NONE_DEFAULT)
+
+        stage.set(PARAM_WITH_NONE_DEFAULT, 3)
+        self.assertEqual(3, stage.get(PARAM_WITH_NONE_DEFAULT))
+
+    def test_param_set_invalid_value(self):
+        stage = MyStage()
+        import pytest
+
+        with pytest.raises(ValueError, match='Parameter int_param is given an invalid value 100.'):
+            stage.set(INT_PARAM, 100)
+
+        with pytest.raises(ValueError,
+                           match='Parameter float_param is given an invalid value 100.0.'):
+            stage.set(FLOAT_PARAM, 100.0)
+
+        with pytest.raises(TypeError,
+                           match="Parameter int_param's type <class 'int'> is incompatible with "
+                                 "the type of <class 'str'>"):
+            stage.set(INT_PARAM, "100")
+
+        with pytest.raises(TypeError,
+                           match="Parameter string_param's type <class 'str'> is incompatible with"
+                                 " the type of <class 'int'>"):
+            stage.set(STRING_PARAM, 100)
+
+    def test_param_set_valid_value(self):
+        stage = MyStage()
+
+        stage.set(BOOLEAN_PARAM, True)
+        self.assertTrue(stage.get(BOOLEAN_PARAM))
+
+        stage.set(INT_PARAM, 50)
+        self.assertEqual(50, stage.get(INT_PARAM))
+
+        stage.set(FLOAT_PARAM, 50.0)
+        self.assertEqual(50.0, stage.get(FLOAT_PARAM))
+
+        stage.set(STRING_PARAM, "50")
+        self.assertEqual("50", stage.get(STRING_PARAM))
+
+        stage.set(INT_ARRAY_PARAM, [50, 51])
+        self.assertEqual([50, 51], stage.get(INT_ARRAY_PARAM))
+
+        stage.set(FLOAT_ARRAY_PARAM, [50.0, 51.0])
+        self.assertEqual([50.0, 51.0], stage.get(FLOAT_ARRAY_PARAM))
+
+        stage.set(STRING_ARRAY_PARAM, ["50", "51"])
+        self.assertEqual(["50", "51"], stage.get(STRING_ARRAY_PARAM))
+
+    def test_stage_save_load(self):
+        stage = MyStage()
+        stage.set(PARAM_WITH_NONE_DEFAULT, 1)
+        path = os.path.join(self.temp_dir, "test_stage_save_load")
+        stage.save(path)
+        loaded_stage = MyStage.load(self.env, path)
+        self.assertEqual(stage.get_param_map(), loaded_stage.get_param_map())
+        self.assertEqual(1, loaded_stage.get(INT_PARAM))
+
+    def test_validators(self):
+        gt = ParamValidators.gt(10)
+        self.assertFalse(gt.validate(None))
+        self.assertFalse(gt.validate(5))
+        self.assertFalse(gt.validate(10))
+        self.assertTrue(gt.validate(15))
+
+        gt_eq = ParamValidators.gt_eq(10)
+        self.assertFalse(gt_eq.validate(None))
+        self.assertFalse(gt_eq.validate(5))
+        self.assertTrue(gt_eq.validate(10))
+        self.assertTrue(gt_eq.validate(15))
+
+        lt = ParamValidators.lt(10)
+        self.assertFalse(lt.validate(None))
+        self.assertTrue(lt.validate(5))
+        self.assertFalse(lt.validate(10))
+        self.assertFalse(lt.validate(15))
+
+        lt_eq = ParamValidators.lt_eq(10)
+        self.assertFalse(lt_eq.validate(None))
+        self.assertTrue(lt_eq.validate(5))
+        self.assertTrue(lt_eq.validate(10))
+        self.assertFalse(lt_eq.validate(15))
+
+        in_range_inclusive = ParamValidators.in_range(5, 15)
+        self.assertFalse(in_range_inclusive.validate(None))
+        self.assertFalse(in_range_inclusive.validate(0))
+        self.assertTrue(in_range_inclusive.validate(5))
+        self.assertTrue(in_range_inclusive.validate(10))
+        self.assertTrue(in_range_inclusive.validate(15))
+        self.assertFalse(in_range_inclusive.validate(20))
+
+        in_range_exclusive = ParamValidators.in_range(5, 15, False, False)
+        self.assertFalse(in_range_exclusive.validate(None))
+        self.assertFalse(in_range_exclusive.validate(0))
+        self.assertFalse(in_range_exclusive.validate(5))
+        self.assertTrue(in_range_exclusive.validate(10))
+        self.assertFalse(in_range_exclusive.validate(15))
+        self.assertFalse(in_range_exclusive.validate(20))
+
+        in_array = ParamValidators.in_array([1, 2, 3])
+        self.assertFalse(in_array.validate(None))
+        self.assertTrue(in_array.validate(1))
+        self.assertFalse(in_array.validate(0))
+
+        not_null = ParamValidators.not_null()
+        self.assertTrue(not_null.validate(5))
+        self.assertFalse(not_null.validate(None))
+
+
+class MyStage(Stage):
+    def __init__(self):
+        self._param_map = {}  # type: Dict[Param, Any]
+        self._init_param()
+
+    def save(self, path: str) -> None:
+        from pyflink.ml.util import read_write_utils
+        read_write_utils.save_metadata(self, path)
+
+    @classmethod
+    def load(cls, env: StreamExecutionEnvironment, path: str):
+        from pyflink.ml.util import read_write_utils
+        return read_write_utils.load_stage_param(path)
+
+    def get_param_map(self) -> Dict['Param[Any]', Any]:
+        return self._param_map
+
+    def _init_param(self):
+        self._param_map[BOOLEAN_PARAM] = BOOLEAN_PARAM.default_value
+        self._param_map[INT_PARAM] = INT_PARAM.default_value
+        self._param_map[FLOAT_PARAM] = FLOAT_PARAM.default_value
+        self._param_map[STRING_PARAM] = STRING_PARAM.default_value
+        self._param_map[INT_ARRAY_PARAM] = INT_ARRAY_PARAM.default_value
+        self._param_map[FLOAT_ARRAY_PARAM] = FLOAT_ARRAY_PARAM.default_value
+        self._param_map[STRING_ARRAY_PARAM] = STRING_ARRAY_PARAM.default_value
+        self._param_map[EXTRA_INT_PARAM] = EXTRA_INT_PARAM.default_value
+        self._param_map[PARAM_WITH_NONE_DEFAULT] = PARAM_WITH_NONE_DEFAULT.default_value
diff --git a/flink-ml-python/pyflink/ml/tests/test_utils.py b/flink-ml-python/pyflink/ml/tests/test_utils.py
new file mode 100644
index 0000000..dce68cd
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/tests/test_utils.py
@@ -0,0 +1,34 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+import shutil
+import tempfile
+import unittest
+
+from pyflink.datastream import StreamExecutionEnvironment
+from pyflink.table import StreamTableEnvironment
+
+
+class PyFlinkMLTestCase(unittest.TestCase):
+    def setUp(self):
+        self.env = StreamExecutionEnvironment.get_execution_environment()
+        self.t_env = StreamTableEnvironment.create(self.env)
+        self.t_env.get_config().get_configuration().set_string("parallelism.default", "2")
+        self.temp_dir = tempfile.mkdtemp()
+
+    def tearDown(self) -> None:
+        shutil.rmtree(self.temp_dir, ignore_errors=True)
diff --git a/flink-ml-python/pyflink/ml/util/__init__.py b/flink-ml-python/pyflink/ml/util/__init__.py
new file mode 100644
index 0000000..65b48d4
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/util/__init__.py
@@ -0,0 +1,17 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
diff --git a/flink-ml-python/pyflink/ml/util/read_write_utils.py b/flink-ml-python/pyflink/ml/util/read_write_utils.py
new file mode 100644
index 0000000..059b744
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/util/read_write_utils.py
@@ -0,0 +1,160 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+import os
+import pickle
+import time
+from importlib import import_module
+from typing import List, Dict, Any
+
+import cloudpickle
+from pyflink.datastream import StreamExecutionEnvironment
+
+from pyflink.ml.core.api import Stage
+
+
+def save_pipeline(pipeline: Stage, stages: List[Stage], path: str) -> None:
+    """
+    Saves a Pipeline or PipelineModel with the given list of stages to the given path.
+
+    :param pipeline: A Pipeline or PipelineModel instance.
+    :param stages: A list of stages of the given pipeline.
+    :param path: The parent directory to save the pipeline metadata and its stages.
+    """
+    # Creates parent directories if not already created.
+    os.makedirs(path, exist_ok=True)
+
+    extra_metadata = {'num_stages': len(stages)}
+    save_metadata(pipeline, path, extra_metadata)
+
+    num_stages = len(stages)
+    for i, stage in enumerate(stages):
+        stage_path = get_path_for_pipeline_stage(i, num_stages, path)
+        stage.save(stage_path)
+
+
+def load_pipeline(env: StreamExecutionEnvironment, path: str) -> List[Stage]:
+    """
+    Loads the stages of a Pipeline or PipelineModel from the given path.
+
+    :param env: A StreamExecutionEnvironment instance.
+    :param path: The parent directory that stores the pipeline metadata and its stages.
+    :return: A list of stages.
+    """
+    meta_data = load_metadata(path)
+    num_stages = meta_data['num_stages']
+    return [load_stage(env, get_path_for_pipeline_stage(i, num_stages, path))
+            for i in range(num_stages)]
+
+
+def save_metadata(stage: Stage, path: str, extra_metadata=None) -> None:
+    """
+    Saves the metadata of the given stage and the extra metadata to a file named `metadata` under
+    the given path. The metadata of a stage includes the stage class name, parameter values etc.
+
+    Required: the metadata file under the given path should not exist.
+
+    :param stage: The stage instance.
+    :param path: The parent directory to save the stage metadata.
+    :param extra_metadata: The extra metadata to be saved.
+    """
+    if extra_metadata is None:
+        extra_metadata = {}
+    os.makedirs(path, exist_ok=True)
+
+    metadata = {k: v for k, v in extra_metadata.items()}
+    metadata['module_name'] = str(stage.__module__)
+    metadata['class_name'] = str(type(stage).__name__)
+    metadata['timestamp'] = time.time()
+    metadata['param_map'] = {cloudpickle.dumps(k): k.json_encode(v)
+                             for k, v in stage.get_param_map().items()}
+    # TODO: add version in the metadata.
+
+    metadata_bytes = pickle.dumps(metadata)
+    metadata_path = os.path.join(path, 'metadata')
+    if os.path.isfile(metadata_path):
+        raise IOError(f'File {metadata_path} already exists.')
+    with open(metadata_path, 'wb') as fd:
+        fd.write(metadata_bytes)
+
+
+def load_metadata(path: str) -> Dict[str, Any]:
+    """
+    Loads the metadata from the metadata file under the given path.
+
+    :param path: The parent directory of the metadata file to read from.
+    :return: A Dict from metadata name to metadata value.
+    """
+    metadata_path = os.path.join(path, "metadata")
+    with open(metadata_path, 'rb') as fd:
+        metadata_bytes = fd.read()
+    meta_data = pickle.loads(metadata_bytes)
+    return meta_data
+
+
+def load_stage(env: StreamExecutionEnvironment, path: str) -> Stage:
+    """
+    Loads the stage from the given path by invoking the static load() method of the stage. The
+    stage module name and class name are read from the metadata file under the given path. The
+    load() method is expected to construct the stage instance with the saved parameters, model
+    data and other metadata, if they exist.
+
+    :param env: A StreamExecutionEnvironment instance.
+    :param path: The parent directory of the stage metadata file.
+    :return: An instance of Stage.
+    """
+    metadata = load_metadata(path)
+    module_name = metadata.get('module_name')
+    class_name = metadata.get('class_name')
+    stage_class = getattr(import_module(module_name), class_name)
+    return stage_class.load(env, path)
+
+
+def load_stage_param(path: str) -> Stage:
+    """
+    Loads the stage with the saved parameters from the given path. This method reads the metadata
+    file under the given path, instantiates the stage using its no-argument constructor, and
+    loads the stage with the param_map from the metadata file.
+
+    Note: This method does not attempt to read model data from the given path. Caller needs to
+    read model data from the given path if the stage has model data.
+
+    :param path: The parent directory of the stage metadata file.
+    :return: A Stage instance.
+    """
+    metadata = load_metadata(path)
+    module_name = metadata.get('module_name')
+    class_name = metadata.get('class_name')
+    stage_class = getattr(import_module(module_name), class_name)
+    stage = stage_class()
+
+    param_map = metadata.get('param_map')
+    for k, v in param_map.items():
+        param = cloudpickle.loads(k)
+        value = param.json_decode(v)
+        stage.set(param, value)
+    return stage
+
+
+def get_path_for_pipeline_stage(stage_idx: int, num_stages: int, parent_path: str) -> str:
+    """
+    Returns a string with value {parent_path}/stages/{stage_idx}, where stage_idx is left-padded
+    with zeros to the width of str(num_stages) so that the stage directories sort in order. The
+    resulting string can be used as the directory to save a stage of the Pipeline or
+    PipelineModel.
+    """
+    format_str = "{:0>%dd}" % len(str(num_stages))
+    return os.path.abspath(os.path.join(parent_path, "stages", format_str.format(stage_idx)))
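
A small round-trip sketch tying these helpers together (the stage class comes from the
illustrative Add10Model example above; the paths are temporary): save_metadata writes the
pickled metadata file, load_metadata reads it back, and get_path_for_pipeline_stage shows where
each stage of a pipeline would be stored.

```python
import tempfile

from pyflink.ml.util.read_write_utils import (get_path_for_pipeline_stage,
                                              load_metadata, save_metadata)

stage = Add10Model()
path = tempfile.mkdtemp()
save_metadata(stage, path)

meta = load_metadata(path)
assert meta['class_name'] == 'Add10Model'
assert meta['param_map'] == {}  # Add10Model declares no parameters

# Each stage of a pipeline gets its own zero-padded directory under
# {parent_path}/stages/.
print(get_path_for_pipeline_stage(0, 12, path))
```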
diff --git a/flink-ml-python/pyflink/ml/version.py b/flink-ml-python/pyflink/ml/version.py
new file mode 100644
index 0000000..4d91394
--- /dev/null
+++ b/flink-ml-python/pyflink/ml/version.py
@@ -0,0 +1,23 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+"""
+The version is kept consistent with the Flink ML version and follows PEP 440.
+.. seealso:: https://www.python.org/dev/peps/pep-0440
+"""
+__version__ = "0.1.dev0"
diff --git a/flink-ml-python/setup.cfg b/flink-ml-python/setup.cfg
new file mode 100644
index 0000000..b29e341
--- /dev/null
+++ b/flink-ml-python/setup.cfg
@@ -0,0 +1,19 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+[metadata]
+description-file = README.md
diff --git a/flink-ml-python/setup.py b/flink-ml-python/setup.py
new file mode 100644
index 0000000..f3f8aa6
--- /dev/null
+++ b/flink-ml-python/setup.py
@@ -0,0 +1,120 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+import io
+import os
+import sys
+from shutil import copytree, rmtree
+
+from setuptools import setup
+
+if sys.version_info < (3, 6):
+    print("Python versions prior to 3.6 are not supported for Flink ML.",
+          file=sys.stderr)
+    sys.exit(-1)
+
+
+def remove_if_exists(file_path):
+    if os.path.exists(file_path):
+        if os.path.islink(file_path) or os.path.isfile(file_path):
+            os.remove(file_path)
+        else:
+            assert os.path.isdir(file_path)
+            rmtree(file_path)
+
+
+this_directory = os.path.abspath(os.path.dirname(__file__))
+version_file = os.path.join(this_directory, 'pyflink/ml/version.py')
+
+try:
+    with open(version_file) as f:
+        exec(f.read())
+except IOError:
+    print("Failed to load Flink ML version file for packaging. " +
+          "'%s' not found!" % version_file,
+          file=sys.stderr)
+    sys.exit(-1)
+VERSION = __version__  # noqa
+
+with io.open(os.path.join(this_directory, 'README.md'), 'r', encoding='utf-8') as f:
+    long_description = f.read()
+
+TEMP_PATH = "deps"
+
+EXAMPLES_TEMP_PATH = os.path.join(TEMP_PATH, "examples")
+
+in_flink_ml_source = os.path.isfile("../flink-ml-core/src/main/java/org/apache/flink/ml/api/"
+                                    "Stage.java")
+try:
+    if in_flink_ml_source:
+
+        try:
+            os.mkdir(TEMP_PATH)
+        except OSError:
+            print("Temp path for symlink to parent already exists: {0}".format(TEMP_PATH),
+                  file=sys.stderr)
+            sys.exit(-1)
+        flink_ml_version = VERSION.replace(".dev0", "-SNAPSHOT")
+        FLINK_ML_ROOT = os.path.abspath("..")
+
+        EXAMPLES_PATH = os.path.join(this_directory, "pyflink/examples")
+
+        try:
+            os.symlink(EXAMPLES_PATH, EXAMPLES_TEMP_PATH)
+        except BaseException:  # pylint: disable=broad-except
+            copytree(EXAMPLES_PATH, EXAMPLES_TEMP_PATH)
+
+    PACKAGES = ['pyflink',
+                'pyflink.ml',
+                'pyflink.ml.core',
+                'pyflink.ml.lib',
+                'pyflink.ml.util',
+                'pyflink.examples']
+
+    PACKAGE_DIR = {
+        'pyflink.examples': TEMP_PATH + '/examples'}
+
+    PACKAGE_DATA = {
+        'pyflink.examples': ['*.py', '*/*.py']}
+
+    setup(
+        name='apache-flink-ml',
+        version=VERSION,
+        packages=PACKAGES,
+        include_package_data=True,
+        package_dir=PACKAGE_DIR,
+        package_data=PACKAGE_DATA,
+        url='https://flink.apache.org',
+        license='https://www.apache.org/licenses/LICENSE-2.0',
+        author='Apache Software Foundation',
+        author_email='dev@flink.apache.org',
+        python_requires='>=3.6',
+        install_requires=['apache-flink==1.14.0', 'pandas>=1.0,<1.2.0', 'jsonpickle==2.0.0',
+                          'cloudpickle==1.2.2'],
+        tests_require=['pytest==4.4.1'],
+        description='Apache Flink ML Python API',
+        long_description=long_description,
+        long_description_content_type='text/markdown',
+        classifiers=[
+            'Development Status :: 5 - Production/Stable',
+            'License :: OSI Approved :: Apache Software License',
+            'Programming Language :: Python :: 3.6',
+            'Programming Language :: Python :: 3.7',
+            'Programming Language :: Python :: 3.8'],
+    )
+finally:
+    if in_flink_ml_source:
+        remove_if_exists(TEMP_PATH)