Posted to commits@paimon.apache.org by lz...@apache.org on 2023/05/09 10:59:35 UTC

[incubator-paimon] branch release-0.4 updated (47f2c43c3 -> c96648ed0)

This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a change to branch release-0.4
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git


    from 47f2c43c3 [core] use append-only mode as the default write mode (#976)
     new b0b513f07 [doc] Table is append-only if no primary key
     new 3925b0dd5 [core] Add validation in Schema#Builder (#1030)
     new ba2079ee8 [Improve] add column comment for mysql cdc create table. (#1055)
     new 3091d93df [cdc] add set type for mysql cdc action (#1065)
     new d34fcdc06 [doc] Update Engines Matrix
     new 0830db278 [flink] CompactAction support catalog config (#1069)
     new 611e97aaf [doc] Add hadoop prefix option configuration way
     new b2d67c671 [core] Add addColumn refactoring method for SchemaChange class (#981)
     new 922dae558 [core] Improve SchemaChange (#1086)
     new c96648ed0 [license] Add license for bundled dependencies (#1067)

The 10 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 LICENSE                                            |  26 ++++
 NOTICE                                             |   9 ++
 docs/content/concepts/append-only-table.md         |   3 +-
 docs/content/engines/overview.md                   |  21 ++--
 docs/content/filesystems/hdfs.md                   |   1 +
 docs/content/maintenance/write-performance.md      |  24 +++-
 licenses/LICENSE.anchorjs                          |  21 ++++
 .../LICENSE.antlr-java-grammar-files               |   0
 paimon-codegen/src/main/resources/META-INF/NOTICE  |   3 +-
 .../main/resources/META-INF/licenses/LICENSE.scala |  28 +++++
 .../paimon/lookup/hash/HashLookupStoreReader.java  |  18 ++-
 .../paimon/lookup/hash/HashLookupStoreWriter.java  |  18 ++-
 .../java/org/apache/paimon/utils/TypeUtils.java    |  22 ++++
 .../org/apache/paimon/utils/VarLengthIntUtils.java |   4 +
 paimon-common/src/main/resources/META-INF/NOTICE   |  12 +-
 .../resources/META-INF/licenses/LICENSE.janino     |  31 +++++
 .../org/apache/paimon/data/DataFormatTestUtil.java |  12 ++
 .../main/java/org/apache/paimon/schema/Schema.java |  28 +++++
 .../org/apache/paimon/schema/SchemaChange.java     |  12 ++
 .../org/apache/paimon/catalog/CatalogTestBase.java | 131 ++++++++++++++++++++-
 .../apache/paimon/schema/SchemaBuilderTest.java    |  67 +++++++++++
 .../paimon/tests/cdc/MySqlIgnoreCaseE2EeTest.java  |   4 +-
 .../src/main/resources/META-INF/NOTICE             |   9 +-
 .../META-INF/licenses/LICENSE.animal-sniffer       |   9 ++
 .../licenses/LICENSE.checker-framework-qualifiers  |  22 ++++
 .../resources/META-INF/licenses/LICENSE.dnsjava    |  30 +++++
 .../src/main/resources/META-INF/NOTICE             |   6 +-
 .../resources/META-INF/licenses/LICENSE.jacoco     |  14 +++
 .../src/main/resources/META-INF/NOTICE             |   9 +-
 .../META-INF/licenses/LICENSE.animal-sniffer       |   9 ++
 .../licenses/LICENSE.checker-framework-qualifiers  |  22 ++++
 .../resources/META-INF/licenses/LICENSE.dnsjava    |  30 +++++
 .../org/apache/paimon/flink/action/Action.java     |  19 +++
 .../org/apache/paimon/flink/action/ActionBase.java |  11 +-
 .../apache/paimon/flink/action/CompactAction.java  |  27 ++++-
 .../flink/action/cdc/mysql/MySqlActionUtils.java   |   9 +-
 .../paimon/flink/action/cdc/mysql/MySqlSchema.java |  23 ++--
 .../action/cdc/mysql/MySqlSyncDatabaseAction.java  |  57 +++------
 .../cdc/mysql/MySqlSyncTableActionITCase.java      |  23 +++-
 .../src/test/resources/mysql/setup.sql             |  27 +++--
 paimon-format/src/main/resources/META-INF/NOTICE   |   2 +-
 .../resources/META-INF/licenses/LICENSE.protobuf   |  30 ++---
 .../src/main/resources-filtered/META-INF/NOTICE    |  16 ---
 43 files changed, 730 insertions(+), 169 deletions(-)
 create mode 100644 licenses/LICENSE.anchorjs
 rename {paimon-common/src/main/resources/META-INF/licenses => licenses}/LICENSE.antlr-java-grammar-files (100%)
 create mode 100644 paimon-codegen/src/main/resources/META-INF/licenses/LICENSE.scala
 create mode 100644 paimon-common/src/main/resources/META-INF/licenses/LICENSE.janino
 create mode 100644 paimon-core/src/test/java/org/apache/paimon/schema/SchemaBuilderTest.java
 create mode 100644 paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.animal-sniffer
 create mode 100644 paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.checker-framework-qualifiers
 create mode 100644 paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.dnsjava
 create mode 100644 paimon-filesystems/paimon-oss-impl/src/main/resources/META-INF/licenses/LICENSE.jacoco
 create mode 100644 paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.animal-sniffer
 create mode 100644 paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.checker-framework-qualifiers
 create mode 100644 paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.dnsjava
 copy paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE-re2j => paimon-format/src/main/resources/META-INF/licenses/LICENSE.protobuf (52%)
 delete mode 100644 paimon-hive/paimon-hive-catalog/src/main/resources-filtered/META-INF/NOTICE


[incubator-paimon] 10/10: [license] Add license for bundled dependencies (#1067)

Posted by lz...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.4
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git

commit c96648ed0c09d2bf79822cf3c993a026f82b0ecc
Author: Jingsong Lee <ji...@gmail.com>
AuthorDate: Tue May 9 18:04:19 2023 +0800

    [license] Add license for bundled dependencies (#1067)
---
 LICENSE                                            | 26 ++++++++++++++++++
 NOTICE                                             |  9 ++++++
 licenses/LICENSE.anchorjs                          | 21 ++++++++++++++
 .../LICENSE.antlr-java-grammar-files               |  0
 paimon-codegen/src/main/resources/META-INF/NOTICE  |  3 +-
 .../main/resources/META-INF/licenses/LICENSE.scala | 28 +++++++++++++++++++
 .../paimon/lookup/hash/HashLookupStoreReader.java  | 18 +++++-------
 .../paimon/lookup/hash/HashLookupStoreWriter.java  | 18 +++++-------
 .../org/apache/paimon/utils/VarLengthIntUtils.java |  4 +++
 paimon-common/src/main/resources/META-INF/NOTICE   | 12 +-------
 .../resources/META-INF/licenses/LICENSE.janino     | 31 +++++++++++++++++++++
 .../src/main/resources/META-INF/NOTICE             |  9 +++---
 .../META-INF/licenses/LICENSE.animal-sniffer       |  9 ++++++
 .../licenses/LICENSE.checker-framework-qualifiers  | 22 +++++++++++++++
 .../resources/META-INF/licenses/LICENSE.dnsjava    | 30 ++++++++++++++++++++
 .../src/main/resources/META-INF/NOTICE             |  6 ++--
 .../resources/META-INF/licenses/LICENSE.jacoco     | 14 ++++++++++
 .../src/main/resources/META-INF/NOTICE             |  9 +++---
 .../META-INF/licenses/LICENSE.animal-sniffer       |  9 ++++++
 .../licenses/LICENSE.checker-framework-qualifiers  | 22 +++++++++++++++
 .../resources/META-INF/licenses/LICENSE.dnsjava    | 30 ++++++++++++++++++++
 paimon-format/src/main/resources/META-INF/NOTICE   |  2 +-
 .../resources/META-INF/licenses/LICENSE.protobuf   | 32 ++++++++++++++++++++++
 .../src/main/resources-filtered/META-INF/NOTICE    | 16 -----------
 24 files changed, 319 insertions(+), 61 deletions(-)

diff --git a/LICENSE b/LICENSE
index 261eeb9e9..d98f6ed90 100644
--- a/LICENSE
+++ b/LICENSE
@@ -199,3 +199,29 @@
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
+
+------------------------------------------------------------------------------------
+This product bundles various third-party components under other open source licenses.
+This section summarizes those components and their licenses. See licenses/
+for text of these licenses.
+
+Apache Software Foundation License 2.0
+--------------------------------------
+
+paimon-common/src/main/java/org/apache/paimon/lookup/hash/HashLookupStoreWriter.java
+paimon-common/src/main/java/org/apache/paimon/lookup/hash/HashLookupStoreReader.java
+paimon-common/src/main/java/org/apache/paimon/utils/VarLengthIntUtils.java
+paimon-filesystems/paimon-s3-impl/src/main/java/com/amazonaws/services/s3/model/transform/XmlResponsesSaxParser.java
+
+
+MIT License
+-----------
+
+docs/static/js/anchor.min.js
+
+
+BSD License
+------------
+
+paimon-common/src/main/antlr4/JavaLexer.g4
+paimon-common/src/main/antlr4/JavaParser.g4
diff --git a/NOTICE b/NOTICE
index 6f7d8e059..a1502d1e1 100644
--- a/NOTICE
+++ b/NOTICE
@@ -6,3 +6,12 @@ The Apache Software Foundation (http://www.apache.org/).
 
 Apache Flink
 Copyright 2014-2023 The Apache Software Foundation
+
+Apache Hadoop
+Copyright 2006 and onwards The Apache Software Foundation.
+
+PalDB
+Copyright 2015 LinkedIn Corp
+
+AWS SDK for Java
+Copyright 2010-2014 Amazon.com, Inc. or its affiliates
diff --git a/licenses/LICENSE.anchorjs b/licenses/LICENSE.anchorjs
new file mode 100644
index 000000000..1ee872674
--- /dev/null
+++ b/licenses/LICENSE.anchorjs
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2021 Bryan Braun
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
\ No newline at end of file
diff --git a/paimon-common/src/main/resources/META-INF/licenses/LICENSE.antlr-java-grammar-files b/licenses/LICENSE.antlr-java-grammar-files
similarity index 100%
rename from paimon-common/src/main/resources/META-INF/licenses/LICENSE.antlr-java-grammar-files
rename to licenses/LICENSE.antlr-java-grammar-files
diff --git a/paimon-codegen/src/main/resources/META-INF/NOTICE b/paimon-codegen/src/main/resources/META-INF/NOTICE
index 130fcf845..81a9de97a 100644
--- a/paimon-codegen/src/main/resources/META-INF/NOTICE
+++ b/paimon-codegen/src/main/resources/META-INF/NOTICE
@@ -4,7 +4,8 @@ Copyright 2023-2023 The Apache Software Foundation
 This product includes software developed at
 The Apache Software Foundation (http://www.apache.org/).
 
-The following dependencies all share the same BSD license which you find under licenses/LICENSE.scala.
+The following dependencies all share the same BSD license.
+You find it under licenses/LICENSE.scala.
 
 - org.scala-lang:scala-compiler:2.12.7
 - org.scala-lang:scala-library:2.12.7
diff --git a/paimon-codegen/src/main/resources/META-INF/licenses/LICENSE.scala b/paimon-codegen/src/main/resources/META-INF/licenses/LICENSE.scala
new file mode 100644
index 000000000..cad3ba745
--- /dev/null
+++ b/paimon-codegen/src/main/resources/META-INF/licenses/LICENSE.scala
@@ -0,0 +1,28 @@
+Copyright (c) 2002-2018 EPFL
+  Copyright (c) 2011-2018 Lightbend, Inc.
+
+  All rights reserved.
+
+  Redistribution and use in source and binary forms, with or without modification,
+are permitted provided that the following conditions are met:
+
+  * Redistributions of source code must retain the above copyright notice,
+this list of conditions and the following disclaimer.
+* Redistributions in binary form must reproduce the above copyright notice,
+this list of conditions and the following disclaimer in the documentation
+and/or other materials provided with the distribution.
+* Neither the name of the EPFL nor the names of its contributors
+may be used to endorse or promote products derived from this software
+  without specific prior written permission.
+
+  THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+  "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+  LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+  A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
+  CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
+EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
+  PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
+LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
+  NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
+SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
\ No newline at end of file
diff --git a/paimon-common/src/main/java/org/apache/paimon/lookup/hash/HashLookupStoreReader.java b/paimon-common/src/main/java/org/apache/paimon/lookup/hash/HashLookupStoreReader.java
index 0407d0450..b55d96df8 100644
--- a/paimon-common/src/main/java/org/apache/paimon/lookup/hash/HashLookupStoreReader.java
+++ b/paimon-common/src/main/java/org/apache/paimon/lookup/hash/HashLookupStoreReader.java
@@ -1,22 +1,18 @@
 /*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
+ * Copyright 2015 LinkedIn Corp. All rights reserved.
  *
- *     http://www.apache.org/licenses/LICENSE-2.0
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
  *
  * Unless required by applicable law or agreed to in writing, software
  * distributed under the License is distributed on an "AS IS" BASIS,
  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
  */
 
-/* This file is based on source code from the PalDB Project (https://github.com/linkedin/PalDB), licensed by the Apache
+/* This file is based on source code of StorageReader from the PalDB Project (https://github.com/linkedin/PalDB), licensed by the Apache
  * Software Foundation (ASF) under the Apache License, Version 2.0. See the NOTICE file distributed with this work for
  * additional information regarding copyright ownership. */
 
diff --git a/paimon-common/src/main/java/org/apache/paimon/lookup/hash/HashLookupStoreWriter.java b/paimon-common/src/main/java/org/apache/paimon/lookup/hash/HashLookupStoreWriter.java
index ce946e841..66b811f60 100644
--- a/paimon-common/src/main/java/org/apache/paimon/lookup/hash/HashLookupStoreWriter.java
+++ b/paimon-common/src/main/java/org/apache/paimon/lookup/hash/HashLookupStoreWriter.java
@@ -1,22 +1,18 @@
 /*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
+ * Copyright 2015 LinkedIn Corp. All rights reserved.
  *
- *     http://www.apache.org/licenses/LICENSE-2.0
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
  *
  * Unless required by applicable law or agreed to in writing, software
  * distributed under the License is distributed on an "AS IS" BASIS,
  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
  */
 
-/* This file is based on source code from the PalDB Project (https://github.com/linkedin/PalDB), licensed by the Apache
+/* This file is based on source code of StorageWriter from the PalDB Project (https://github.com/linkedin/PalDB), licensed by the Apache
  * Software Foundation (ASF) under the Apache License, Version 2.0. See the NOTICE file distributed with this work for
  * additional information regarding copyright ownership. */
 
diff --git a/paimon-common/src/main/java/org/apache/paimon/utils/VarLengthIntUtils.java b/paimon-common/src/main/java/org/apache/paimon/utils/VarLengthIntUtils.java
index f5f476d49..99f7ff146 100644
--- a/paimon-common/src/main/java/org/apache/paimon/utils/VarLengthIntUtils.java
+++ b/paimon-common/src/main/java/org/apache/paimon/utils/VarLengthIntUtils.java
@@ -12,6 +12,10 @@
  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  */
 
+/* This file is based on source code of LongPacker from the PalDB Project (https://github.com/linkedin/PalDB), licensed by the Apache
+ * Software Foundation (ASF) under the Apache License, Version 2.0. See the NOTICE file distributed with this work for
+ * additional information regarding copyright ownership. */
+
 package org.apache.paimon.utils;
 
 import java.io.DataInput;
diff --git a/paimon-common/src/main/resources/META-INF/NOTICE b/paimon-common/src/main/resources/META-INF/NOTICE
index da5a1a3a5..0ef216eab 100644
--- a/paimon-common/src/main/resources/META-INF/NOTICE
+++ b/paimon-common/src/main/resources/META-INF/NOTICE
@@ -4,19 +4,9 @@ Copyright 2023-2023 The Apache Software Foundation
 This product includes software developed at
 The Apache Software Foundation (http://www.apache.org/).
 
-This project bundles the following dependencies under the Apache Software License 2.0. (http://www.apache.org/licenses/LICENSE-2.0.txt)
-
-- com.linkedin.paldb:paldb:1.2.0
-
 This project bundles the following dependencies under the BSD 3-clause license.
-See bundled license files for details.
+You find them under licenses/LICENSE.antlr-runtime and licenses/LICENSE.janino.
 
 - org.antlr:antlr4-runtime:4.7
 - org.codehaus.janino:janino:3.0.11
 - org.codehaus.janino:commons-compiler:3.0.11
-
-This project bundles the following files under the BSD license.
-See bundled license files for details.
-
-- Antlr Java grammar files (https://github.com/antlr/grammars-v4/tree/master/java/java)
-  -> in src/main/antlr4
diff --git a/paimon-common/src/main/resources/META-INF/licenses/LICENSE.janino b/paimon-common/src/main/resources/META-INF/licenses/LICENSE.janino
new file mode 100644
index 000000000..ef871e242
--- /dev/null
+++ b/paimon-common/src/main/resources/META-INF/licenses/LICENSE.janino
@@ -0,0 +1,31 @@
+Janino - An embedded Java[TM] compiler
+
+Copyright (c) 2001-2016, Arno Unkrig
+Copyright (c) 2015-2016  TIBCO Software Inc.
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions
+are met:
+
+   1. Redistributions of source code must retain the above copyright
+      notice, this list of conditions and the following disclaimer.
+   2. Redistributions in binary form must reproduce the above
+      copyright notice, this list of conditions and the following
+      disclaimer in the documentation and/or other materials
+      provided with the distribution.
+   3. Neither the name of JANINO nor the names of its contributors
+      may be used to endorse or promote products derived from this
+      software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS OR CONTRIBUTORS BE
+LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER
+IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR
+OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
+IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/NOTICE b/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/NOTICE
index 6d56d19ff..2a106d057 100644
--- a/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/NOTICE
+++ b/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/NOTICE
@@ -33,22 +33,23 @@ This project bundles the following dependencies under the Apache Software Licens
 - org.apache.kerby:kerby-util:1.0.1
 - org.xerial.snappy:snappy-java:1.1.8.3
 
-This project bundles the following dependencies under the MIT (https://opensource.org/licenses/MIT)
+This project bundles the following dependencies under the MIT (https://opensource.org/licenses/MIT).
+You find them under licenses/LICENSE.checker-framework-qualifiers and licenses/LICENSE.animal-sniffer.
 
 - org.checkerframework:checker-qual:2.5.2
 - org.codehaus.mojo:animal-sniffer-annotations:1.17
 
 This project bundles the following dependencies under BSD-2 License (https://opensource.org/licenses/BSD-2-Clause).
-See bundled license files for details.
+You find it under licenses/LICENSE.dnsjava.
 
 - dnsjava:dnsjava:2.1.7
 
 This project bundles the following dependencies under the Go License (https://golang.org/LICENSE).
-See bundled license files for details.
+You find it under licenses/LICENSE.re2j.
 
 - com.google.re2j:re2j:1.1
 
 This project bundles the following dependencies under BSD License (https://opensource.org/licenses/bsd-license.php).
-See bundled license files for details.
+You find it under licenses/LICENSE.stax2api.
 
 - org.codehaus.woodstox:stax2-api:4.2.1 (https://github.com/FasterXML/stax2-api/tree/stax2-api-4.2.1)
diff --git a/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.animal-sniffer b/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.animal-sniffer
new file mode 100644
index 000000000..3f28a5f53
--- /dev/null
+++ b/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.animal-sniffer
@@ -0,0 +1,9 @@
+MIT License
+
+Copyright (c) <year> <copyright holders>
+
+Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
\ No newline at end of file
diff --git a/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.checker-framework-qualifiers b/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.checker-framework-qualifiers
new file mode 100644
index 000000000..7b59b5c98
--- /dev/null
+++ b/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.checker-framework-qualifiers
@@ -0,0 +1,22 @@
+Checker Framework qualifiers
+Copyright 2004-present by the Checker Framework developers
+
+MIT License:
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+THE SOFTWARE.
\ No newline at end of file
diff --git a/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.dnsjava b/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.dnsjava
new file mode 100644
index 000000000..8daf3fc25
--- /dev/null
+++ b/paimon-filesystems/paimon-hadoop-shaded/src/main/resources/META-INF/licenses/LICENSE.dnsjava
@@ -0,0 +1,30 @@
+Copyright (c) 1998-2019, Brian Wellington
+Copyright (c) 2005 VeriSign. All rights reserved.
+Copyright (c) 2019-2021, dnsjava authors
+
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are met:
+
+1. Redistributions of source code must retain the above copyright notice, this
+   list of conditions and the following disclaimer.
+
+2. Redistributions in binary form must reproduce the above copyright notice,
+   this list of conditions and the following disclaimer in the documentation
+   and/or other materials provided with the distribution.
+
+3. Neither the name of the copyright holder nor the names of its
+   contributors may be used to endorse or promote products derived from
+   this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
\ No newline at end of file
diff --git a/paimon-filesystems/paimon-oss-impl/src/main/resources/META-INF/NOTICE b/paimon-filesystems/paimon-oss-impl/src/main/resources/META-INF/NOTICE
index e8ca2b7c9..4d3bba91b 100644
--- a/paimon-filesystems/paimon-oss-impl/src/main/resources/META-INF/NOTICE
+++ b/paimon-filesystems/paimon-oss-impl/src/main/resources/META-INF/NOTICE
@@ -25,10 +25,12 @@ This project bundles the following dependencies under the Apache Software Licens
 - org.ini4j:ini4j:0.5.4
 - stax:stax-api:1.0.1
 
-The binary distribution of this product bundles these dependencies under the Eclipse Public License - v 2.0 (https://www.eclipse.org/org/documents/epl-2.0/EPL-2.0.txt)
+The binary distribution of this product bundles these dependencies under the Eclipse Public License - v 2.0
+You find it under licenses/LICENSE.jacoco.
+
 - org.jacoco:org.jacoco.agent:runtime:0.8.5
 
 This project bundles the following dependencies under the JDOM license.
-See bundled license files for details.
+You find it under licenses/LICENSE.jdom.
 
 - org.jdom:jdom2:2.0.6
diff --git a/paimon-filesystems/paimon-oss-impl/src/main/resources/META-INF/licenses/LICENSE.jacoco b/paimon-filesystems/paimon-oss-impl/src/main/resources/META-INF/licenses/LICENSE.jacoco
new file mode 100644
index 000000000..47ff775ba
--- /dev/null
+++ b/paimon-filesystems/paimon-oss-impl/src/main/resources/META-INF/licenses/LICENSE.jacoco
@@ -0,0 +1,14 @@
+License
+=======
+
+Copyright (c) 2009, 2019 Mountainminds GmbH & Co. KG and Contributors
+
+The JaCoCo Java Code Coverage Library and all included documentation is made
+available by Mountainminds GmbH & Co. KG, Munich. Except indicated below, the
+Content is provided to you under the terms and conditions of the Eclipse Public
+License Version 2.0 ("EPL"). A copy of the EPL is available at
+[https://www.eclipse.org/legal/epl-2.0/](https://www.eclipse.org/legal/epl-2.0/).
+
+Please visit
+[http://www.jacoco.org/jacoco/trunk/doc/license.html](http://www.jacoco.org/jacoco/trunk/doc/license.html)
+for the complete license information including third party licenses and trademarks.
\ No newline at end of file
diff --git a/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/NOTICE b/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/NOTICE
index 16d759dca..9c552f765 100644
--- a/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/NOTICE
+++ b/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/NOTICE
@@ -48,27 +48,28 @@ This project bundles the following dependencies under the Apache Software Licens
 - software.amazon.ion:ion-java:1.0.2
 
 This project bundles the following dependencies under BSD-2 License (https://opensource.org/licenses/BSD-2-Clause).
-See bundled license files for details.
+You find it under licenses/LICENSE.dnsjava.
 
 - dnsjava:dnsjava:2.1.7
 
 This project bundles the following dependencies under the MIT (https://opensource.org/licenses/MIT)
+You find them under licenses/LICENSE.checker-framework-qualifiers and licenses/LICENSE.animal-sniffer.
 
 - org.checkerframework:checker-qual:2.5.2
 - org.codehaus.mojo:animal-sniffer-annotations:1.17
 
 This project bundles the following dependencies under the CDDL 1.1 license.
-See bundled license files for details.
+You find it under licenses/LICENSE.jaxb.
 
 - javax.xml.bind:jaxb-api:2.3.1
 
 This project bundles the following dependencies under the Go License (https://golang.org/LICENSE).
-See bundled license files for details.
+You find it under licenses/LICENSE.re2j.
 
 - com.google.re2j:re2j:1.1
 
 This project bundles the following dependencies under BSD License (https://opensource.org/licenses/bsd-license.php).
-See bundled license files for details.
+You find it under licenses/LICENSE.stax2api.
 
 - org.codehaus.woodstox:stax2-api:4.2.1 (https://github.com/FasterXML/stax2-api/tree/stax2-api-4.2.1)
 
diff --git a/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.animal-sniffer b/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.animal-sniffer
new file mode 100644
index 000000000..3f28a5f53
--- /dev/null
+++ b/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.animal-sniffer
@@ -0,0 +1,9 @@
+MIT License
+
+Copyright (c) <year> <copyright holders>
+
+Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
\ No newline at end of file
diff --git a/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.checker-framework-qualifiers b/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.checker-framework-qualifiers
new file mode 100644
index 000000000..7b59b5c98
--- /dev/null
+++ b/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.checker-framework-qualifiers
@@ -0,0 +1,22 @@
+Checker Framework qualifiers
+Copyright 2004-present by the Checker Framework developers
+
+MIT License:
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+THE SOFTWARE.
\ No newline at end of file
diff --git a/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.dnsjava b/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.dnsjava
new file mode 100644
index 000000000..8daf3fc25
--- /dev/null
+++ b/paimon-filesystems/paimon-s3-impl/src/main/resources/META-INF/licenses/LICENSE.dnsjava
@@ -0,0 +1,30 @@
+Copyright (c) 1998-2019, Brian Wellington
+Copyright (c) 2005 VeriSign. All rights reserved.
+Copyright (c) 2019-2021, dnsjava authors
+
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are met:
+
+1. Redistributions of source code must retain the above copyright notice, this
+   list of conditions and the following disclaimer.
+
+2. Redistributions in binary form must reproduce the above copyright notice,
+   this list of conditions and the following disclaimer in the documentation
+   and/or other materials provided with the distribution.
+
+3. Neither the name of the copyright holder nor the names of its
+   contributors may be used to endorse or promote products derived from
+   this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
\ No newline at end of file
diff --git a/paimon-format/src/main/resources/META-INF/NOTICE b/paimon-format/src/main/resources/META-INF/NOTICE
index 46eed083f..ccc807ce4 100644
--- a/paimon-format/src/main/resources/META-INF/NOTICE
+++ b/paimon-format/src/main/resources/META-INF/NOTICE
@@ -28,6 +28,6 @@ This project bundles the following dependencies under the Apache Software Licens
 - commons-pool:commons-pool:1.6
 
 This project bundles the following dependencies under the BSD license.
-See bundled license files for details.
+You find it under licenses/LICENSE.protobuf.
 
 - com.google.protobuf:protobuf-java:2.5.0
diff --git a/paimon-format/src/main/resources/META-INF/licenses/LICENSE.protobuf b/paimon-format/src/main/resources/META-INF/licenses/LICENSE.protobuf
new file mode 100644
index 000000000..19b305b00
--- /dev/null
+++ b/paimon-format/src/main/resources/META-INF/licenses/LICENSE.protobuf
@@ -0,0 +1,32 @@
+Copyright 2008 Google Inc.  All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+    * Redistributions of source code must retain the above copyright
+notice, this list of conditions and the following disclaimer.
+    * Redistributions in binary form must reproduce the above
+copyright notice, this list of conditions and the following disclaimer
+in the documentation and/or other materials provided with the
+distribution.
+    * Neither the name of Google Inc. nor the names of its
+contributors may be used to endorse or promote products derived from
+this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+Code generated by the Protocol Buffer compiler is owned by the owner
+of the input file used when generating it.  This code is not
+standalone and requires a support library to be linked with it.  This
+support library is itself covered by the above license.
diff --git a/paimon-hive/paimon-hive-catalog/src/main/resources-filtered/META-INF/NOTICE b/paimon-hive/paimon-hive-catalog/src/main/resources-filtered/META-INF/NOTICE
deleted file mode 100644
index c05d034e3..000000000
--- a/paimon-hive/paimon-hive-catalog/src/main/resources-filtered/META-INF/NOTICE
+++ /dev/null
@@ -1,16 +0,0 @@
-paimon-hive-catalog
-Copyright 2023-2023 The Apache Software Foundation
-
-This product includes software developed at
-The Apache Software Foundation (http://www.apache.org/).
-
-This project bundles the following dependencies under the Apache Software License 2.0. (http://www.apache.org/licenses/LICENSE-2.0.txt)
-
-- com.google.guava:guava:14.0.1
-- org.apache.hive:hive-common:${hive.version}
-- org.apache.hive:hive-metastore:${hive.version}
-- org.apache.hive:hive-serde:${hive.version}
-- org.apache.hive.shims:hive-shims-common:${hive.version}
-- org.apache.hive.shims:hive-shims-0.23:${hive.version}
-- org.apache.thrift:libfb303:0.9.3
-- org.apache.thrift:libthrift:0.9.3


[incubator-paimon] 05/10: [doc] Update Engines Matrix

Posted by lz...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.4
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git

commit d34fcdc06eb18462f13979279985e4b3782e2fc5
Author: JingsongLi <lz...@aliyun.com>
AuthorDate: Sat May 6 15:31:56 2023 +0800

    [doc] Update Engines Matrix
---
 docs/content/engines/overview.md | 21 +++++++++++++--------
 1 file changed, 13 insertions(+), 8 deletions(-)

diff --git a/docs/content/engines/overview.md b/docs/content/engines/overview.md
index ea546d955..2168b0f1f 100644
--- a/docs/content/engines/overview.md
+++ b/docs/content/engines/overview.md
@@ -32,13 +32,18 @@ Apache Spark and Apache Hive.
 
 ## Compatibility Matrix
 
-| Engine | Version                     | Feature                                                                              | Read Pushdown      |
-|--------|-----------------------------|--------------------------------------------------------------------------------------|--------------------|
-| Flink  | 1.17/1.16/1.15/1.14         | batch/streaming read, batch/streaming write, create/drop table, create/drop database | Projection, Filter |
-| Hive   | 3.1/2.3/2.2/2.1/2.1 CDH 6.3 | batch read                                                                           | Projection, Filter |
-| Spark  | 3.3/3.2/3.1                 | batch read, batch write, create/drop table, create/drop database                     | Projection, Filter |
-| Spark  | 2.4                         | batch read                                                                           | Projection, Filter |
-| Trino  | 388/358                     | batch read, create/drop table, create/drop database                                  | Projection, Filter |
-| Presto | 0.236 and above             | batch read                                                                           | Projection, Filter |
+| Engine    | Version       | Batch Read | Batch Write | Create Table | Streaming Write | Streaming Read | Batch Overwrite |
+|:---------:|:-------------:|:----------:|:-----------:|:------------:|:---------------:|:--------------:|:---------------:|
+| Flink     | 1.14 - 1.17   |   ✅       |   ✅         |   ✅         |   ✅            |   ✅            |   ✅            |
+| Hive      | 2.1 - 3.1     |   ✅       |   ✅         |   ❌         |   ❌            |   ❌            |   ❌            |
+| Spark     | 3.1 - 3.4     |   ✅       |   ✅         |   ✅         |   ❌            |   ❌            |   ❌            |
+| Spark     | 2.4           |   ✅       |   ❌         |   ❌         |   ❌            |   ❌            |   ❌            |
+| Trino     | 358 - 400     |   ✅       |   ❌         |   ❌         |   ❌            |   ❌            |   ❌            |
+| Presto    | 0.236 - 0.279 |   ✅       |   ❌         |   ❌         |   ❌            |   ❌            |   ❌            |
+
+Ongoing engines:
+- Doris: Under development, [Support Paimon catalog](https://github.com/apache/doris/issues/18433), [Doris Roadmap 2023](https://github.com/apache/doris/issues/16392).
+- Seatunnel: Under development, [Introduce paimon connector](https://github.com/apache/incubator-seatunnel/pull/4178).
+- Starrocks: Under discussion
 
 [Download Link]({{< ref "project/download#engine-jars" >}})
\ No newline at end of file


[incubator-paimon] 09/10: [core] Improve SchemaChange (#1086)

Posted by lz...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.4
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git

commit 922dae558997d7777fdd4634c63e24db07e0b843
Author: s7monk <34...@users.noreply.github.com>
AuthorDate: Tue May 9 11:24:06 2023 +0800

    [core] Improve SchemaChange (#1086)
---
 .../org/apache/paimon/schema/SchemaChange.java     |   8 ++
 .../org/apache/paimon/catalog/CatalogTestBase.java | 121 +++++++++++++++++++++
 2 files changed, 129 insertions(+)

diff --git a/paimon-core/src/main/java/org/apache/paimon/schema/SchemaChange.java b/paimon-core/src/main/java/org/apache/paimon/schema/SchemaChange.java
index 574a3c8f2..28c49af7c 100644
--- a/paimon-core/src/main/java/org/apache/paimon/schema/SchemaChange.java
+++ b/paimon-core/src/main/java/org/apache/paimon/schema/SchemaChange.java
@@ -67,10 +67,18 @@ public interface SchemaChange extends Serializable {
         return new UpdateColumnType(fieldName, newDataType);
     }
 
+    static SchemaChange updateColumnNullability(String fieldName, boolean newNullability) {
+        return new UpdateColumnNullability(new String[] {fieldName}, newNullability);
+    }
+
     static SchemaChange updateColumnNullability(String[] fieldNames, boolean newNullability) {
         return new UpdateColumnNullability(fieldNames, newNullability);
     }
 
+    static SchemaChange updateColumnComment(String fieldName, String comment) {
+        return new UpdateColumnComment(new String[] {fieldName}, comment);
+    }
+
     static SchemaChange updateColumnComment(String[] fieldNames, String comment) {
         return new UpdateColumnComment(fieldNames, comment);
     }
diff --git a/paimon-core/src/test/java/org/apache/paimon/catalog/CatalogTestBase.java b/paimon-core/src/test/java/org/apache/paimon/catalog/CatalogTestBase.java
index f2cb60c19..b5892f729 100644
--- a/paimon-core/src/test/java/org/apache/paimon/catalog/CatalogTestBase.java
+++ b/paimon-core/src/test/java/org/apache/paimon/catalog/CatalogTestBase.java
@@ -27,6 +27,7 @@ import org.apache.paimon.schema.SchemaChange;
 import org.apache.paimon.table.Table;
 import org.apache.paimon.types.DataField;
 import org.apache.paimon.types.DataTypes;
+import org.apache.paimon.types.RowType;
 
 import org.apache.paimon.shade.guava30.com.google.common.collect.Lists;
 import org.apache.paimon.shade.guava30.com.google.common.collect.Maps;
@@ -613,4 +614,124 @@ public abstract class CatalogTestBase {
                 .hasRootCauseInstanceOf(IllegalArgumentException.class)
                 .hasMessageContaining("Cannot update partition column [dt] type in the table");
     }
+
+    @Test
+    public void testAlterTableUpdateColumnComment() throws Exception {
+        catalog.createDatabase("test_db", false);
+
+        // Alter table update a column comment in an existing table
+        Identifier identifier = Identifier.create("test_db", "test_table");
+        catalog.createTable(
+                identifier,
+                new Schema(
+                        Lists.newArrayList(
+                                new DataField(0, "col1", DataTypes.STRING(), "field1"),
+                                new DataField(1, "col2", DataTypes.STRING(), "field2"),
+                                new DataField(
+                                        2,
+                                        "col3",
+                                        DataTypes.ROW(
+                                                new DataField(4, "f1", DataTypes.STRING(), "f1"),
+                                                new DataField(5, "f2", DataTypes.STRING(), "f2"),
+                                                new DataField(6, "f3", DataTypes.STRING(), "f3")),
+                                        "field3")),
+                        Collections.emptyList(),
+                        Collections.emptyList(),
+                        Maps.newHashMap(),
+                        ""),
+                false);
+
+        catalog.alterTable(
+                identifier,
+                Lists.newArrayList(SchemaChange.updateColumnComment("col2", "col2 field")),
+                false);
+
+        // Update nested column
+        String[] fields = new String[] {"col3", "f1"};
+        catalog.alterTable(
+                identifier,
+                Lists.newArrayList(SchemaChange.updateColumnComment(fields, "col3 f1 field")),
+                false);
+
+        Table table = catalog.getTable(identifier);
+        assertThat(table.rowType().getFields().get(1).description()).isEqualTo("col2 field");
+        RowType rowType = (RowType) table.rowType().getFields().get(2).type();
+        assertThat(rowType.getFields().get(0).description()).isEqualTo("col3 f1 field");
+
+        // Alter table update a column comment throws Exception when column does not exist
+        assertThatThrownBy(
+                        () ->
+                                catalog.alterTable(
+                                        identifier,
+                                        Lists.newArrayList(
+                                                SchemaChange.updateColumnComment(
+                                                        new String[] {"non_existing_col"}, "")),
+                                        false))
+                .hasMessageContaining("Can not find column: [non_existing_col]");
+    }
+
+    @Test
+    public void testAlterTableUpdateColumnNullability() throws Exception {
+        catalog.createDatabase("test_db", false);
+
+        // Alter table update a column nullability in an existing table
+        Identifier identifier = Identifier.create("test_db", "test_table");
+        catalog.createTable(
+                identifier,
+                new Schema(
+                        Lists.newArrayList(
+                                new DataField(0, "col1", DataTypes.STRING(), "field1"),
+                                new DataField(1, "col2", DataTypes.STRING(), "field2"),
+                                new DataField(
+                                        2,
+                                        "col3",
+                                        DataTypes.ROW(
+                                                new DataField(4, "f1", DataTypes.STRING(), "f1"),
+                                                new DataField(5, "f2", DataTypes.STRING(), "f2"),
+                                                new DataField(6, "f3", DataTypes.STRING(), "f3")),
+                                        "field3")),
+                        Lists.newArrayList("col1"),
+                        Lists.newArrayList("col1", "col2"),
+                        Maps.newHashMap(),
+                        ""),
+                false);
+
+        catalog.alterTable(
+                identifier,
+                Lists.newArrayList(SchemaChange.updateColumnNullability("col1", false)),
+                false);
+
+        // Update nested column
+        String[] fields = new String[] {"col3", "f1"};
+        catalog.alterTable(
+                identifier,
+                Lists.newArrayList(SchemaChange.updateColumnNullability(fields, false)),
+                false);
+
+        Table table = catalog.getTable(identifier);
+        assertThat(table.rowType().getFields().get(0).type().isNullable()).isEqualTo(false);
+
+        // Alter table update a column nullability throws Exception when column does not exist
+        assertThatThrownBy(
+                        () ->
+                                catalog.alterTable(
+                                        identifier,
+                                        Lists.newArrayList(
+                                                SchemaChange.updateColumnNullability(
+                                                        new String[] {"non_existing_col"}, false)),
+                                        false))
+                .hasMessageContaining("Can not find column: [non_existing_col]");
+
+        // Alter table update a column nullability throws Exception when column is pk columns
+        assertThatThrownBy(
+                        () ->
+                                catalog.alterTable(
+                                        identifier,
+                                        Lists.newArrayList(
+                                                SchemaChange.updateColumnNullability(
+                                                        new String[] {"col2"}, true)),
+                                        false))
+                .hasRootCauseInstanceOf(UnsupportedOperationException.class)
+                .hasMessageContaining("Cannot change nullability of primary key");
+    }
 }
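
A minimal usage sketch of the new single-argument overloads (they simply wrap the existing String[] variants, as shown in the SchemaChange.java hunk above). The catalog instance and the database, table and column names below are placeholders that mirror the CatalogTestBase code in this commit; this is illustration only, not part of the patch:

    import java.util.Arrays;

    import org.apache.paimon.catalog.Catalog;
    import org.apache.paimon.catalog.Identifier;
    import org.apache.paimon.schema.SchemaChange;

    public class AlterColumnsSketch {
        // Assumes "my_db.my_table" already exists with top-level columns col1 and col2
        // and a nested row column col3 containing field f1 (as in the test above).
        static void alterColumns(Catalog catalog) throws Exception {
            Identifier identifier = Identifier.create("my_db", "my_table");
            catalog.alterTable(
                    identifier,
                    Arrays.asList(
                            // single-field overloads added by this commit
                            SchemaChange.updateColumnComment("col2", "col2 field"),
                            SchemaChange.updateColumnNullability("col1", false),
                            // nested fields still use the String[] variants
                            SchemaChange.updateColumnComment(
                                    new String[] {"col3", "f1"}, "col3 f1 field")),
                    false);
        }
    }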


[incubator-paimon] 06/10: [flink] CompactAction support catalog config (#1069)

Posted by lz...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.4
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git

commit 0830db2785e6cd54f2e97ee6d200e4f8b7356f9d
Author: Daixinyu <da...@users.noreply.github.com>
AuthorDate: Sat May 6 15:33:50 2023 +0800

    [flink] CompactAction support catalog config (#1069)
---
 docs/content/maintenance/write-performance.md      | 24 ++++++++-
 .../org/apache/paimon/flink/action/Action.java     | 19 ++++++++
 .../org/apache/paimon/flink/action/ActionBase.java | 11 ++---
 .../apache/paimon/flink/action/CompactAction.java  | 27 ++++++++--
 .../action/cdc/mysql/MySqlSyncDatabaseAction.java  | 57 +++++++---------------
 5 files changed, 85 insertions(+), 53 deletions(-)

diff --git a/docs/content/maintenance/write-performance.md b/docs/content/maintenance/write-performance.md
index 467b0bc01..1dd82cf3b 100644
--- a/docs/content/maintenance/write-performance.md
+++ b/docs/content/maintenance/write-performance.md
@@ -165,9 +165,12 @@ Run the following command to submit a compaction job for the table.
     /path/to/paimon-flink-**-{{< version >}}.jar \
     compact \
     --warehouse <warehouse-path> \
-    --database <database-name> \
-    --table <table-name>
+    --database <database-name> \ 
+    --table <table-name> \
+    [--partition <partition-name>] \
+    [--catalog-conf <paimon-catalog-conf> [--catalog-conf <paimon-catalog-conf> ...]] \
 ```
+* `--catalog-conf` is the configuration for Paimon catalog. Each configuration should be specified in the format `key=value`. See [here]({{< ref "maintenance/configurations" >}}) for a complete list of catalog configurations.
 
 If you submit a batch job (set `execution.runtime-mode: batch` in Flink's configuration), all current table files will be compacted. If you submit a streaming job (set `execution.runtime-mode: streaming` in Flink's configuration), the job will continuously monitor new changes to the table and perform compactions as needed.
 
@@ -177,6 +180,23 @@ If you only want to submit the compaction job and don't want to wait until the j
 
 {{< /hint >}}
 
+Example
+
+```bash
+<FLINK_HOME>/bin/flink run \
+    -c org.apache.paimon.flink.action.FlinkActions \
+    /path/to/paimon-flink-**-{{< version >}}.jar \
+    compact \
+    --warehouse s3:///path/to/warehouse \
+    --database test_db \
+    --table test_table \
+    --partition dt=20221126,hh=08 \
+    --partition dt=20221127,hh=09 \
+    --catalog-conf s3.endpoint=https://****.com \
+    --catalog-conf s3.access-key=***** \
+    --catalog-conf s3.secret-key=*****
+```
+
 For more usage of the compact action, see
 
 ```bash
diff --git a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/Action.java b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/Action.java
index b1dc1204c..7cb6ca8d3 100644
--- a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/Action.java
+++ b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/Action.java
@@ -161,4 +161,23 @@ public interface Action {
             System.out.println("For detailed options of each action, run <action> --help");
         }
     }
+
+    static Optional<Map<String, String>> getConfigMap(MultipleParameterTool params, String key) {
+        if (!params.has(key)) {
+            return Optional.empty();
+        }
+
+        Map<String, String> map = new HashMap<>();
+        for (String param : params.getMultiParameter(key)) {
+            String[] kv = param.split("=");
+            if (kv.length == 2) {
+                map.put(kv[0], kv[1]);
+                continue;
+            }
+
+            System.err.println("Invalid key " + key + ". Please use format 'key=value'");
+            return Optional.empty();
+        }
+        return Optional.of(map);
+    }
 }
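
For readers skimming the patch, the helper above is self-contained string handling. A minimal standalone sketch of the same key=value parsing, with illustrative class and method names (`ConfigMapSketch`, `parseKeyValues`) that are not part of this commit:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;

public class ConfigMapSketch {

    // Simplified stand-in for the new Action#getConfigMap helper: each repeated
    // value of one multi-parameter (e.g. every --catalog-conf argument) must be
    // of the form key=value; otherwise the whole result is Optional.empty().
    static Optional<Map<String, String>> parseKeyValues(List<String> params) {
        Map<String, String> map = new HashMap<>();
        for (String param : params) {
            String[] kv = param.split("=");
            if (kv.length != 2) {
                return Optional.empty(); // malformed entry; the caller prints an error
            }
            map.put(kv[0], kv[1]);
        }
        return Optional.of(map);
    }

    public static void main(String[] args) {
        List<String> raw = Arrays.asList("s3.endpoint=https://example.com", "s3.access-key=xyz");
        parseKeyValues(raw).ifPresent(System.out::println);
    }
}
```

Note that, as in the committed helper, a value that itself contains `=` (so that `split("=")` yields more than two parts) is treated as malformed.
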
diff --git a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/ActionBase.java b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/ActionBase.java
index 528b68f25..de310148c 100644
--- a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/ActionBase.java
+++ b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/ActionBase.java
@@ -78,12 +78,10 @@ public abstract class ActionBase implements Action {
         this(warehouse, databaseName, tableName, new Options());
     }
 
-    ActionBase(String warehouse, String databaseName, String tableName, Options options) {
+    ActionBase(String warehouse, String databaseName, String tableName, Options catalogOptions) {
+        catalogOptions.set(CatalogOptions.WAREHOUSE, warehouse);
         identifier = new Identifier(databaseName, tableName);
-        catalog =
-                CatalogFactory.createCatalog(
-                        CatalogContext.create(
-                                new Options().set(CatalogOptions.WAREHOUSE, warehouse)));
+        catalog = CatalogFactory.createCatalog(CatalogContext.create(catalogOptions));
         flinkCatalog = new FlinkCatalog(catalog, catalogName, DEFAULT_DATABASE);
 
         env = StreamExecutionEnvironment.getExecutionEnvironment();
@@ -95,9 +93,6 @@ public abstract class ActionBase implements Action {
 
         try {
             table = catalog.getTable(identifier);
-            if (options.toMap().size() > 0) {
-                table = table.copy(options.toMap());
-            }
         } catch (Catalog.TableNotExistException e) {
             LOG.error("Table doesn't exist in given path.", e);
             System.err.println("Table doesn't exist in given path.");
diff --git a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/CompactAction.java b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/CompactAction.java
index d21d5e704..b525992b6 100644
--- a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/CompactAction.java
+++ b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/CompactAction.java
@@ -36,10 +36,12 @@ import org.apache.flink.table.data.RowData;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
+import java.util.Collections;
 import java.util.List;
 import java.util.Map;
 import java.util.Optional;
 
+import static org.apache.paimon.flink.action.Action.getConfigMap;
 import static org.apache.paimon.flink.action.Action.getPartitions;
 import static org.apache.paimon.flink.action.Action.getTablePath;
 
@@ -52,14 +54,18 @@ public class CompactAction extends ActionBase {
     private final CompactorSinkBuilder sinkBuilder;
 
     CompactAction(String warehouse, String database, String tableName) {
-        super(warehouse, database, tableName, new Options().set(CoreOptions.WRITE_ONLY, false));
+        this(warehouse, database, tableName, new Options());
+    }
+
+    CompactAction(String warehouse, String database, String tableName, Options catalogOptions) {
+        super(warehouse, database, tableName, catalogOptions);
         if (!(table instanceof FileStoreTable)) {
             throw new UnsupportedOperationException(
                     String.format(
                             "Only FileStoreTable supports compact action. The table type is '%s'.",
                             table.getClass().getName()));
         }
-
+        table = table.copy(Collections.singletonMap(CoreOptions.WRITE_ONLY.key(), "false"));
         sourceBuilder =
                 new CompactorSourceBuilder(identifier.getFullName(), (FileStoreTable) table);
         sinkBuilder = new CompactorSinkBuilder((FileStoreTable) table);
@@ -104,7 +110,12 @@ public class CompactAction extends ActionBase {
             return Optional.empty();
         }
 
-        CompactAction action = new CompactAction(tablePath.f0, tablePath.f1, tablePath.f2);
+        Optional<Map<String, String>> catalogConfigOption = getConfigMap(params, "catalog-conf");
+        Options catalogOptions =
+                Options.fromMap(catalogConfigOption.orElse(Collections.emptyMap()));
+
+        CompactAction action =
+                new CompactAction(tablePath.f0, tablePath.f1, tablePath.f2, catalogOptions);
 
         if (params.has("partition")) {
             List<Map<String, String>> partitions = getPartitions(params);
@@ -127,6 +138,9 @@ public class CompactAction extends ActionBase {
         System.out.println(
                 "  compact --warehouse <warehouse-path> --database <database-name> "
                         + "--table <table-name> [--partition <partition-name>]");
+        System.out.println(
+                "  compact --warehouse s3://path/to/warehouse --database <database-name> "
+                        + "--table <table-name> [--catalog-conf <paimon-catalog-conf> [--catalog-conf <paimon-catalog-conf> ...]]");
         System.out.println("  compact --path <table-path> [--partition <partition-name>]");
         System.out.println();
 
@@ -142,6 +156,13 @@ public class CompactAction extends ActionBase {
         System.out.println(
                 "  compact --warehouse hdfs:///path/to/warehouse --database test_db --table test_table "
                         + "--partition dt=20221126,hh=08 --partition dt=20221127,hh=09");
+        System.out.println(
+                "  compact --warehouse s3:///path/to/warehouse "
+                        + "--database test_db "
+                        + "--table test_table "
+                        + "--catalog-conf s3.endpoint=https://****.com "
+                        + "--catalog-conf s3.access-key=***** "
+                        + "--catalog-conf s3.secret-key=***** ");
     }
 
     @Override
diff --git a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncDatabaseAction.java b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncDatabaseAction.java
index dcd2c9d5c..d39702383 100644
--- a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncDatabaseAction.java
+++ b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncDatabaseAction.java
@@ -50,12 +50,12 @@ import java.sql.ResultSet;
 import java.time.ZoneId;
 import java.util.ArrayList;
 import java.util.Collections;
-import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import java.util.Optional;
 import java.util.regex.Pattern;
 
+import static org.apache.paimon.flink.action.Action.getConfigMap;
 import static org.apache.paimon.utils.Preconditions.checkArgument;
 
 /**
@@ -338,45 +338,22 @@ public class MySqlSyncDatabaseAction implements Action {
         String includingTables = params.get("including-tables");
         String excludingTables = params.get("excluding-tables");
 
-        Map<String, String> mySqlConfig = getConfigMap(params, "mysql-conf");
-        Map<String, String> catalogConfig = getConfigMap(params, "catalog-conf");
-        Map<String, String> tableConfig = getConfigMap(params, "table-conf");
-        if (mySqlConfig == null) {
-            return Optional.empty();
-        }
-
-        return Optional.of(
-                new MySqlSyncDatabaseAction(
-                        mySqlConfig,
-                        warehouse,
-                        database,
-                        ignoreIncompatible,
-                        tablePrefix,
-                        tableSuffix,
-                        includingTables,
-                        excludingTables,
-                        catalogConfig == null ? Collections.emptyMap() : catalogConfig,
-                        tableConfig == null ? Collections.emptyMap() : tableConfig));
-    }
-
-    private static Map<String, String> getConfigMap(MultipleParameterTool params, String key) {
-        if (!params.has(key)) {
-            return null;
-        }
-
-        Map<String, String> map = new HashMap<>();
-        for (String param : params.getMultiParameter(key)) {
-            String[] kv = param.split("=");
-            if (kv.length == 2) {
-                map.put(kv[0], kv[1]);
-                continue;
-            }
-
-            System.err.println(
-                    "Invalid " + key + " " + param + ".\nRun mysql-sync-database --help for help.");
-            return null;
-        }
-        return map;
+        Optional<Map<String, String>> mySqlConfigOption = getConfigMap(params, "mysql-conf");
+        Optional<Map<String, String>> catalogConfigOption = getConfigMap(params, "catalog-conf");
+        Optional<Map<String, String>> tableConfigOption = getConfigMap(params, "table-conf");
+        return mySqlConfigOption.map(
+                mySqlConfig ->
+                        new MySqlSyncDatabaseAction(
+                                mySqlConfig,
+                                warehouse,
+                                database,
+                                ignoreIncompatible,
+                                tablePrefix,
+                                tableSuffix,
+                                includingTables,
+                                excludingTables,
+                                catalogConfigOption.orElse(Collections.emptyMap()),
+                                tableConfigOption.orElse(Collections.emptyMap())));
     }
 
     private static void printHelp() {
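
The rewrite above also changes the contract of config parsing from null-on-error to `Optional`, which is why `MySqlSyncDatabaseAction` can now be built with a single `map` call. A small self-contained sketch of that pattern (all names below are illustrative, not from the commit):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

public class OptionalMapSketch {
    // Instead of "if (config == null) return Optional.empty();" followed by a
    // constructor call, the required config drives Optional.map(...), while the
    // optional configs fall back to empty maps via orElse(...).
    static Optional<String> describe(
            Optional<Map<String, String>> requiredConfig,
            Optional<Map<String, String>> optionalConfig) {
        return requiredConfig.map(
                cfg ->
                        "required=" + cfg
                                + ", optional=" + optionalConfig.orElse(Collections.emptyMap()));
    }

    public static void main(String[] args) {
        Map<String, String> cfg = new HashMap<>();
        cfg.put("hostname", "localhost");
        System.out.println(describe(Optional.empty(), Optional.empty()));
        System.out.println(describe(Optional.of(cfg), Optional.empty()));
    }
}
```
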


[incubator-paimon] 04/10: [cdc] add set type for mysql cdc action (#1065)

Posted by lz...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.4
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git

commit 3091d93df3f071231a7644d6aca9d0d3ab844faf
Author: JunZhang <zh...@126.com>
AuthorDate: Fri May 5 15:39:59 2023 +0800

    [cdc] add set type for mysql cdc action (#1065)
---
 .../java/org/apache/paimon/utils/TypeUtils.java    | 22 ++++++++++++++++++++++
 .../org/apache/paimon/data/DataFormatTestUtil.java | 12 ++++++++++++
 .../cdc/mysql/MySqlSyncTableActionITCase.java      |  8 ++++++--
 .../src/test/resources/mysql/setup.sql             |  7 +++++--
 4 files changed, 45 insertions(+), 4 deletions(-)

diff --git a/paimon-common/src/main/java/org/apache/paimon/utils/TypeUtils.java b/paimon-common/src/main/java/org/apache/paimon/utils/TypeUtils.java
index b4ed5bded..b853d4881 100644
--- a/paimon-common/src/main/java/org/apache/paimon/utils/TypeUtils.java
+++ b/paimon-common/src/main/java/org/apache/paimon/utils/TypeUtils.java
@@ -20,7 +20,9 @@ package org.apache.paimon.utils;
 
 import org.apache.paimon.data.BinaryString;
 import org.apache.paimon.data.Decimal;
+import org.apache.paimon.data.GenericArray;
 import org.apache.paimon.data.Timestamp;
+import org.apache.paimon.types.ArrayType;
 import org.apache.paimon.types.DataField;
 import org.apache.paimon.types.DataType;
 import org.apache.paimon.types.DataTypeChecks;
@@ -29,6 +31,7 @@ import org.apache.paimon.types.DecimalType;
 import org.apache.paimon.types.LocalZonedTimestampType;
 import org.apache.paimon.types.RowType;
 import org.apache.paimon.types.TimestampType;
+import org.apache.paimon.types.VarCharType;
 
 import java.math.BigDecimal;
 import java.time.DateTimeException;
@@ -115,6 +118,25 @@ public class TypeUtils {
             case TIMESTAMP_WITHOUT_TIME_ZONE:
                 TimestampType timestampType = (TimestampType) type;
                 return toTimestamp(str, timestampType.getPrecision());
+            case ARRAY:
+                ArrayType arrayType = (ArrayType) type;
+                DataType elementType = arrayType.getElementType();
+                if (elementType instanceof VarCharType) {
+                    if (s.startsWith("[")) {
+                        s = s.substring(1);
+                    }
+                    if (s.endsWith("]")) {
+                        s = s.substring(0, s.length() - 1);
+                    }
+                    String[] ss = s.split(",");
+                    BinaryString[] binaryStrings = new BinaryString[ss.length];
+                    for (int i = 0; i < ss.length; i++) {
+                        binaryStrings[i] = BinaryString.fromString(ss[i]);
+                    }
+                    return new GenericArray(binaryStrings);
+                } else {
+                    throw new UnsupportedOperationException("Unsupported type " + type);
+                }
             default:
                 throw new UnsupportedOperationException("Unsupported type " + type);
         }
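
For context, the new `ARRAY` branch converts a MySQL `SET` value, which arrives as a comma-separated string (optionally bracketed), into a `GenericArray` of `BinaryString`. A standalone sketch of that conversion, with an illustrative class and method name:

```java
import org.apache.paimon.data.BinaryString;
import org.apache.paimon.data.GenericArray;

public class SetToArraySketch {
    // Strip optional surrounding brackets, split on commas, and wrap each element
    // as a BinaryString, mirroring the ARRAY branch added above.
    static GenericArray setStringToArray(String s) {
        if (s.startsWith("[")) {
            s = s.substring(1);
        }
        if (s.endsWith("]")) {
            s = s.substring(0, s.length() - 1);
        }
        String[] parts = s.split(",");
        BinaryString[] elements = new BinaryString[parts.length];
        for (int i = 0; i < parts.length; i++) {
            elements[i] = BinaryString.fromString(parts[i]);
        }
        return new GenericArray(elements);
    }

    public static void main(String[] args) {
        System.out.println(setStringToArray("a,b").size());    // 2
        System.out.println(setStringToArray("[a, b]").size()); // 2
    }
}
```

As in the committed code, elements are not trimmed, so `"[a, b]"` yields `"a"` and `" b"`.
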
diff --git a/paimon-common/src/test/java/org/apache/paimon/data/DataFormatTestUtil.java b/paimon-common/src/test/java/org/apache/paimon/data/DataFormatTestUtil.java
index 11a08583e..b8b5f9f6d 100644
--- a/paimon-common/src/test/java/org/apache/paimon/data/DataFormatTestUtil.java
+++ b/paimon-common/src/test/java/org/apache/paimon/data/DataFormatTestUtil.java
@@ -18,6 +18,7 @@
 package org.apache.paimon.data;
 
 import org.apache.paimon.memory.MemorySegment;
+import org.apache.paimon.types.ArrayType;
 import org.apache.paimon.types.RowType;
 import org.apache.paimon.utils.StringUtils;
 
@@ -43,6 +44,17 @@ public class DataFormatTestUtil {
                 Object field = fieldGetter.getFieldOrNull(row);
                 if (field instanceof byte[]) {
                     build.append(Arrays.toString((byte[]) field));
+                } else if (field instanceof InternalArray) {
+                    InternalArray internalArray = (InternalArray) field;
+                    ArrayType arrayType = (ArrayType) type.getTypeAt(i);
+                    InternalArray.ElementGetter elementGetter =
+                            InternalArray.createElementGetter(arrayType.getElementType());
+                    String[] result = new String[internalArray.size()];
+                    for (int j = 0; j < internalArray.size(); j++) {
+                        Object object = elementGetter.getElementOrNull(internalArray, j);
+                        result[j] = null == object ? null : object.toString();
+                    }
+                    build.append(Arrays.toString(result));
                 } else {
                     build.append(field);
                 }
diff --git a/paimon-flink/paimon-flink-common/src/test/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncTableActionITCase.java b/paimon-flink/paimon-flink-common/src/test/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncTableActionITCase.java
index db8a93c21..9a5b21f44 100644
--- a/paimon-flink/paimon-flink-common/src/test/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncTableActionITCase.java
+++ b/paimon-flink/paimon-flink-common/src/test/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncTableActionITCase.java
@@ -487,7 +487,8 @@ public class MySqlSyncTableActionITCase extends MySqlActionITCaseBase {
                             DataTypes.STRING(), // _multipoint
                             DataTypes.STRING(), // _multiline
                             DataTypes.STRING(), // _multipolygon
-                            DataTypes.STRING() // _geometrycollection
+                            DataTypes.STRING(), // _geometrycollection
+                            DataTypes.ARRAY(DataTypes.STRING()) // _set
                         },
                         new String[] {
                             "_id",
@@ -564,6 +565,7 @@ public class MySqlSyncTableActionITCase extends MySqlActionITCaseBase {
                             "_multiline",
                             "_multipolygon",
                             "_geometrycollection",
+                            "_set",
                         });
         FileStoreTable table = getFileStoreTable();
         List<String> expected =
@@ -609,7 +611,8 @@ public class MySqlSyncTableActionITCase extends MySqlActionITCaseBase {
                                 + "{\"coordinates\":[[1,1],[2,2]],\"type\":\"MultiPoint\",\"srid\":0}, "
                                 + "{\"coordinates\":[[[1,1],[2,2],[3,3]],[[4,4],[5,5]]],\"type\":\"MultiLineString\",\"srid\":0}, "
                                 + "{\"coordinates\":[[[[0,0],[10,0],[10,10],[0,10],[0,0]]],[[[5,5],[7,5],[7,7],[5,7],[5,5]]]],\"type\":\"MultiPolygon\",\"srid\":0}, "
-                                + "{\"geometries\":[{\"type\":\"Point\",\"coordinates\":[10,10]},{\"type\":\"Point\",\"coordinates\":[30,30]},{\"type\":\"LineString\",\"coordinates\":[[15,15],[20,20]]}],\"type\":\"GeometryCollection\",\"srid\":0}"
+                                + "{\"geometries\":[{\"type\":\"Point\",\"coordinates\":[10,10]},{\"type\":\"Point\",\"coordinates\":[30,30]},{\"type\":\"LineString\",\"coordinates\":[[15,15],[20,20]]}],\"type\":\"GeometryCollection\",\"srid\":0}, "
+                                + "[a, b]"
                                 + "]",
                         "+I["
                                 + "2, 2.2, "
@@ -642,6 +645,7 @@ public class MySqlSyncTableActionITCase extends MySqlActionITCaseBase {
                                 + "NULL, "
                                 + "NULL, "
                                 + "NULL, "
+                                + "NULL, "
                                 + "NULL"
                                 + "]");
         waitForResult(expected, table, rowType, Arrays.asList("pt", "_id"));
diff --git a/paimon-flink/paimon-flink-common/src/test/resources/mysql/setup.sql b/paimon-flink/paimon-flink-common/src/test/resources/mysql/setup.sql
index a3ee9a113..dc4a6cbe7 100644
--- a/paimon-flink/paimon-flink-common/src/test/resources/mysql/setup.sql
+++ b/paimon-flink/paimon-flink-common/src/test/resources/mysql/setup.sql
@@ -145,6 +145,7 @@ CREATE TABLE all_types_table (
     _multiline  MULTILINESTRING,
     _multipolygon  MULTIPOLYGON,
     _geometrycollection GEOMETRYCOLLECTION,
+    _set SET('a', 'b', 'c', 'd'),
     PRIMARY KEY (_id)
 );
 
@@ -201,8 +202,9 @@ INSERT INTO all_types_table VALUES (
     ST_GeomFromText('POLYGON((1 1, 2 1, 2 2,  1 2, 1 1))'),
     ST_GeomFromText('MULTIPOINT((1 1),(2 2))'),
     ST_GeomFromText('MultiLineString((1 1,2 2,3 3),(4 4,5 5))'),
-    ST_GeomFromText('MULTIPOLYGON(((0 0,10 0,10 10,0 10,0 0)),((5 5,7 5,7 7,5 7, 5 5)))'),
-    ST_GeomFromText('GEOMETRYCOLLECTION(POINT(10 10), POINT(30 30), LINESTRING(15 15, 20 20))')
+    ST_GeomFromText('MULTIPOLYGON(((0 0, 10 0, 10 10, 0 10, 0 0)), ((5 5, 7 5, 7 7, 5 7, 5 5)))'),
+    ST_GeomFromText('GEOMETRYCOLLECTION(POINT(10 10), POINT(30 30), LINESTRING(15 15, 20 20))'),
+    'a,b'
 ), (
     2, 2.2,
     NULL, NULL, NULL, NULL, NULL, NULL,
@@ -234,6 +236,7 @@ INSERT INTO all_types_table VALUES (
     NULL,
     NULL,
     NULL,
+    NULL,
     NULL
 );
 


[incubator-paimon] 07/10: [doc] Add hadoop prefix option configuration way

Posted by lz...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.4
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git

commit 611e97aaf780fdaa74a3d1f1e7fa677b29c3f3a3
Author: JingsongLi <lz...@aliyun.com>
AuthorDate: Mon May 8 18:20:11 2023 +0800

    [doc] Add hadoop prefix option configuration way
---
 docs/content/filesystems/hdfs.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/content/filesystems/hdfs.md b/docs/content/filesystems/hdfs.md
index c05c17cd7..1d99d973f 100644
--- a/docs/content/filesystems/hdfs.md
+++ b/docs/content/filesystems/hdfs.md
@@ -41,6 +41,7 @@ configure your HDFS:
 
 1. Set environment variable `HADOOP_HOME` or `HADOOP_CONF_DIR`.
 2. Configure `'hadoop-conf-dir'` in the paimon catalog.
+3. Configure Hadoop options through the `'hadoop.'` prefix in the paimon catalog.
 
 The first approach is recommended.
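
As a hedged illustration of option 3 only (the option keys, paths, and import locations below are assumptions, not part of this commit), catalog options prefixed with `hadoop.` are forwarded to the underlying Hadoop configuration when the catalog is created:

```java
// Imports assumed to live in paimon-common / paimon-core.
import org.apache.paimon.catalog.Catalog;
import org.apache.paimon.catalog.CatalogContext;
import org.apache.paimon.catalog.CatalogFactory;
import org.apache.paimon.options.Options;

import java.util.HashMap;
import java.util.Map;

public class HadoopPrefixSketch {
    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("warehouse", "hdfs://namenode:8020/path/to/warehouse");
        // Placeholder Hadoop setting; the 'hadoop.' prefix is stripped and the
        // remainder is handed to the Hadoop configuration.
        conf.put("hadoop.dfs.client.use.datanode.hostname", "true");
        Catalog catalog = CatalogFactory.createCatalog(CatalogContext.create(Options.fromMap(conf)));
        System.out.println(catalog);
    }
}
```
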
 


[incubator-paimon] 08/10: [core] Add addColumn refactoring method for SchemaChange class (#981)

Posted by lz...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.4
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git

commit b2d67c6711ec94b24eed32c4a20f8fd11b9ec5f8
Author: s7monk <34...@users.noreply.github.com>
AuthorDate: Tue May 9 11:19:47 2023 +0800

    [core] Add addColumn refactoring method for SchemaChange class (#981)
---
 .../src/main/java/org/apache/paimon/schema/SchemaChange.java   |  4 ++++
 .../test/java/org/apache/paimon/catalog/CatalogTestBase.java   | 10 ++++++++--
 2 files changed, 12 insertions(+), 2 deletions(-)

diff --git a/paimon-core/src/main/java/org/apache/paimon/schema/SchemaChange.java b/paimon-core/src/main/java/org/apache/paimon/schema/SchemaChange.java
index ff30d48ef..574a3c8f2 100644
--- a/paimon-core/src/main/java/org/apache/paimon/schema/SchemaChange.java
+++ b/paimon-core/src/main/java/org/apache/paimon/schema/SchemaChange.java
@@ -47,6 +47,10 @@ public interface SchemaChange extends Serializable {
         return addColumn(fieldName, dataType, null, null);
     }
 
+    static SchemaChange addColumn(String fieldName, DataType dataType, String comment) {
+        return new AddColumn(fieldName, dataType, comment, null);
+    }
+
     static SchemaChange addColumn(String fieldName, DataType dataType, String comment, Move move) {
         return new AddColumn(fieldName, dataType, comment, move);
     }
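
A short usage sketch of the new overload, mirroring the test change below; the `Catalog` and `Identifier` instances are assumed to exist already:

```java
import org.apache.paimon.catalog.Catalog;
import org.apache.paimon.catalog.Identifier;
import org.apache.paimon.schema.SchemaChange;
import org.apache.paimon.types.DataTypes;

import java.util.Arrays;

public class AddColumnSketch {
    // Adds one column without a comment and one with a comment, using the new
    // three-argument addColumn overload.
    static void addColumns(Catalog catalog, Identifier identifier) throws Exception {
        catalog.alterTable(
                identifier,
                Arrays.asList(
                        SchemaChange.addColumn("col2", DataTypes.DATE()),
                        SchemaChange.addColumn("col3", DataTypes.STRING(), "col3 field")),
                false);
    }
}
```
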
diff --git a/paimon-core/src/test/java/org/apache/paimon/catalog/CatalogTestBase.java b/paimon-core/src/test/java/org/apache/paimon/catalog/CatalogTestBase.java
index 684d0d675..f2cb60c19 100644
--- a/paimon-core/src/test/java/org/apache/paimon/catalog/CatalogTestBase.java
+++ b/paimon-core/src/test/java/org/apache/paimon/catalog/CatalogTestBase.java
@@ -409,13 +409,19 @@ public abstract class CatalogTestBase {
                 false);
         catalog.alterTable(
                 identifier,
-                Lists.newArrayList(SchemaChange.addColumn("col2", DataTypes.DATE())),
+                Lists.newArrayList(
+                        SchemaChange.addColumn("col2", DataTypes.DATE()),
+                        SchemaChange.addColumn("col3", DataTypes.STRING(), "col3 field")),
                 false);
         Table table = catalog.getTable(identifier);
-        assertThat(table.rowType().getFields()).hasSize(2);
+        assertThat(table.rowType().getFields()).hasSize(3);
         int index = table.rowType().getFieldIndex("col2");
+        int index2 = table.rowType().getFieldIndex("col3");
         assertThat(index).isEqualTo(1);
+        assertThat(index2).isEqualTo(2);
         assertThat(table.rowType().getTypeAt(index)).isEqualTo(DataTypes.DATE());
+        assertThat(table.rowType().getTypeAt(index2)).isEqualTo(DataTypes.STRING());
+        assertThat(table.rowType().getFields().get(2).description()).isEqualTo("col3 field");
 
         // Alter table throws Exception when table is system table
         assertThatExceptionOfType(IllegalArgumentException.class)


[incubator-paimon] 02/10: [core] Add validation in Schema#Builder (#1030)

Posted by lz...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.4
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git

commit 3925b0dd54dc7df6d0e38dbd3078cd2f55198d78
Author: yuzelin <33...@users.noreply.github.com>
AuthorDate: Fri May 5 13:47:21 2023 +0800

    [core] Add validation in Schema#Builder (#1030)
---
 .../main/java/org/apache/paimon/schema/Schema.java | 28 +++++++++
 .../apache/paimon/schema/SchemaBuilderTest.java    | 67 ++++++++++++++++++++++
 2 files changed, 95 insertions(+)

diff --git a/paimon-core/src/main/java/org/apache/paimon/schema/Schema.java b/paimon-core/src/main/java/org/apache/paimon/schema/Schema.java
index cd90a5d51..cecec0f58 100644
--- a/paimon-core/src/main/java/org/apache/paimon/schema/Schema.java
+++ b/paimon-core/src/main/java/org/apache/paimon/schema/Schema.java
@@ -28,6 +28,7 @@ import javax.annotation.Nullable;
 
 import java.util.ArrayList;
 import java.util.Arrays;
+import java.util.Collections;
 import java.util.HashMap;
 import java.util.HashSet;
 import java.util.List;
@@ -94,7 +95,22 @@ public class Schema {
     private static List<DataField> normalizeFields(
             List<DataField> fields, List<String> primaryKeys, List<String> partitionKeys) {
         List<String> fieldNames = fields.stream().map(DataField::name).collect(Collectors.toList());
+
+        Set<String> duplicateColumns = duplicate(fieldNames);
+        Preconditions.checkState(
+                duplicateColumns.isEmpty(),
+                "Table column %s must not contain duplicate fields. Found: %s",
+                fieldNames,
+                duplicateColumns);
+
         Set<String> allFields = new HashSet<>(fieldNames);
+
+        duplicateColumns = duplicate(partitionKeys);
+        Preconditions.checkState(
+                duplicateColumns.isEmpty(),
+                "Partition key constraint %s must not contain duplicate columns. Found: %s",
+                partitionKeys,
+                duplicateColumns);
         Preconditions.checkState(
                 allFields.containsAll(partitionKeys),
                 "Table column %s should include all partition fields %s",
@@ -104,6 +120,12 @@ public class Schema {
         if (primaryKeys.isEmpty()) {
             return fields;
         }
+        duplicateColumns = duplicate(primaryKeys);
+        Preconditions.checkState(
+                duplicateColumns.isEmpty(),
+                "Primary key constraint %s must not contain duplicate columns. Found: %s",
+                primaryKeys,
+                duplicateColumns);
         Preconditions.checkState(
                 allFields.containsAll(primaryKeys),
                 "Table column %s should include all primary key constraint %s",
@@ -133,6 +155,12 @@ public class Schema {
         return newFields;
     }
 
+    private static Set<String> duplicate(List<String> names) {
+        return names.stream()
+                .filter(name -> Collections.frequency(names, name) > 1)
+                .collect(Collectors.toSet());
+    }
+
     @Override
     public boolean equals(Object o) {
         if (this == o) {
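
The validation relies on a small private `duplicate` helper. An equivalent standalone version, runnable on its own (class name is illustrative):

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class DuplicateSketch {
    // Same idea as the private Schema#duplicate helper above: keep every name
    // that occurs more than once in the list.
    static Set<String> duplicate(List<String> names) {
        return names.stream()
                .filter(name -> Collections.frequency(names, name) > 1)
                .collect(Collectors.toSet());
    }

    public static void main(String[] args) {
        System.out.println(duplicate(Arrays.asList("id", "id", "v1"))); // [id]
    }
}
```

The `Collections.frequency` scan is quadratic, which is fine for the short field and key lists a table schema carries.
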
diff --git a/paimon-core/src/test/java/org/apache/paimon/schema/SchemaBuilderTest.java b/paimon-core/src/test/java/org/apache/paimon/schema/SchemaBuilderTest.java
new file mode 100644
index 000000000..864b77b67
--- /dev/null
+++ b/paimon-core/src/test/java/org/apache/paimon/schema/SchemaBuilderTest.java
@@ -0,0 +1,67 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.paimon.schema;
+
+import org.apache.paimon.testutils.assertj.AssertionUtils;
+import org.apache.paimon.types.DataTypes;
+
+import org.junit.jupiter.api.Test;
+
+import static org.assertj.core.api.AssertionsForClassTypes.assertThatThrownBy;
+
+/** Test for {@link Schema.Builder}. */
+public class SchemaBuilderTest {
+
+    @Test
+    public void testDuplicateColumns() {
+        Schema.Builder builder =
+                Schema.newBuilder().column("id", DataTypes.INT()).column("id", DataTypes.INT());
+
+        assertThatThrownBy(builder::build)
+                .satisfies(
+                        AssertionUtils.anyCauseMatches(
+                                IllegalStateException.class,
+                                "Table column [id, id] must not contain duplicate fields. Found: [id]"));
+    }
+
+    @Test
+    public void testDuplicatePrimaryKeys() {
+        Schema.Builder builder =
+                Schema.newBuilder().column("id", DataTypes.INT()).primaryKey("id", "id");
+
+        assertThatThrownBy(builder::build)
+                .satisfies(
+                        AssertionUtils.anyCauseMatches(
+                                IllegalStateException.class,
+                                "Primary key constraint [id, id] must not contain duplicate columns. Found: [id]"));
+    }
+
+    @Test
+    public void testDuplicatePartitionKeys() {
+        Schema.Builder builder =
+                Schema.newBuilder().column("id", DataTypes.INT()).partitionKeys("id", "id");
+
+        assertThatThrownBy(builder::build)
+                .satisfies(
+                        AssertionUtils.anyCauseMatches(
+                                IllegalStateException.class,
+                                "Partition key constraint [id, id] must not contain duplicate columns. Found: [id]"
+                                        + ""));
+    }
+}


[incubator-paimon] 03/10: [Improve] add column comment for mysql cdc create table. (#1055)

Posted by lz...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.4
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git

commit ba2079ee8044e29645e9d847f2c17fb3ef9673c6
Author: Guangdong Liu <li...@gmail.com>
AuthorDate: Fri May 5 15:17:08 2023 +0800

    [Improve] add column comment for mysql cdc create table. (#1055)
---
 .../paimon/tests/cdc/MySqlIgnoreCaseE2EeTest.java  |  4 ++--
 .../flink/action/cdc/mysql/MySqlActionUtils.java   |  9 +++++----
 .../paimon/flink/action/cdc/mysql/MySqlSchema.java | 23 ++++++++++++++--------
 .../cdc/mysql/MySqlSyncTableActionITCase.java      | 15 ++++++++++++++
 .../src/test/resources/mysql/setup.sql             | 20 +++++++++----------
 5 files changed, 47 insertions(+), 24 deletions(-)

diff --git a/paimon-e2e-tests/src/test/java/org/apache/paimon/tests/cdc/MySqlIgnoreCaseE2EeTest.java b/paimon-e2e-tests/src/test/java/org/apache/paimon/tests/cdc/MySqlIgnoreCaseE2EeTest.java
index c61ad44cc..3ed5595e9 100644
--- a/paimon-e2e-tests/src/test/java/org/apache/paimon/tests/cdc/MySqlIgnoreCaseE2EeTest.java
+++ b/paimon-e2e-tests/src/test/java/org/apache/paimon/tests/cdc/MySqlIgnoreCaseE2EeTest.java
@@ -128,8 +128,8 @@ public class MySqlIgnoreCaseE2EeTest extends MySqlCdcE2eTestBase {
                         createResultSink("result1", "fields STRING"));
 
         checkResult(
-                "[{\"id\":0,\"name\":\"k\",\"type\":\"INT NOT NULL\"},"
-                        + "{\"id\":1,\"name\":\"uppercase_v0\",\"type\":\"VARCHAR(20)\"}]");
+                "[{\"id\":0,\"name\":\"k\",\"type\":\"INT NOT NULL\",\"description\":\"\"},"
+                        + "{\"id\":1,\"name\":\"uppercase_v0\",\"type\":\"VARCHAR(20)\",\"description\":\"\"}]");
         clearCurrentResults();
         cancelJob(jobId);
     }
diff --git a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlActionUtils.java b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlActionUtils.java
index bbbaaeb26..028b742c2 100644
--- a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlActionUtils.java
+++ b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlActionUtils.java
@@ -32,6 +32,7 @@ import com.ververica.cdc.connectors.mysql.table.JdbcUrlUtils;
 import com.ververica.cdc.connectors.mysql.table.StartupOptions;
 import com.ververica.cdc.debezium.JsonDebeziumDeserializationSchema;
 import com.ververica.cdc.debezium.table.DebeziumOptions;
+import org.apache.flink.api.java.tuple.Tuple2;
 import org.apache.flink.configuration.Configuration;
 import org.apache.kafka.connect.json.JsonConverterConfig;
 
@@ -66,13 +67,13 @@ class MySqlActionUtils {
     }
 
     static boolean schemaCompatible(TableSchema tableSchema, MySqlSchema mySqlSchema) {
-        for (Map.Entry<String, DataType> entry : mySqlSchema.fields().entrySet()) {
+        for (Map.Entry<String, Tuple2<DataType, String>> entry : mySqlSchema.fields().entrySet()) {
             int idx = tableSchema.fieldNames().indexOf(entry.getKey());
             if (idx < 0) {
                 return false;
             }
             DataType type = tableSchema.fields().get(idx).type();
-            if (UpdatedDataFieldsProcessFunction.canConvert(entry.getValue(), type)
+            if (UpdatedDataFieldsProcessFunction.canConvert(entry.getValue().f0, type)
                     != UpdatedDataFieldsProcessFunction.ConvertAction.CONVERT) {
                 return false;
             }
@@ -88,8 +89,8 @@ class MySqlActionUtils {
         Schema.Builder builder = Schema.newBuilder();
         builder.options(paimonConfig);
 
-        for (Map.Entry<String, DataType> entry : mySqlSchema.fields().entrySet()) {
-            builder.column(entry.getKey(), entry.getValue());
+        for (Map.Entry<String, Tuple2<DataType, String>> entry : mySqlSchema.fields().entrySet()) {
+            builder.column(entry.getKey(), entry.getValue().f0, entry.getValue().f1);
         }
 
         if (specifiedPrimaryKeys.size() > 0) {
diff --git a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSchema.java b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSchema.java
index 6b5c2831e..2b390b81e 100644
--- a/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSchema.java
+++ b/paimon-flink/paimon-flink-common/src/main/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSchema.java
@@ -21,6 +21,8 @@ package org.apache.paimon.flink.action.cdc.mysql;
 import org.apache.paimon.flink.sink.cdc.UpdatedDataFieldsProcessFunction;
 import org.apache.paimon.types.DataType;
 
+import org.apache.flink.api.java.tuple.Tuple2;
+
 import java.sql.DatabaseMetaData;
 import java.sql.ResultSet;
 import java.util.ArrayList;
@@ -36,7 +38,7 @@ public class MySqlSchema {
     private final String databaseName;
     private final String tableName;
 
-    private final LinkedHashMap<String, DataType> fields;
+    private final LinkedHashMap<String, Tuple2<DataType, String>> fields;
     private final List<String> primaryKeys;
 
     public MySqlSchema(
@@ -51,6 +53,7 @@ public class MySqlSchema {
                 String fieldName = rs.getString("COLUMN_NAME");
                 String fieldType = rs.getString("TYPE_NAME");
                 Integer precision = rs.getInt("COLUMN_SIZE");
+                String fieldComment = rs.getString("REMARKS");
 
                 if (rs.wasNull()) {
                     precision = null;
@@ -67,7 +70,11 @@ public class MySqlSchema {
                                     fieldName, databaseName, tableName));
                     fieldName = fieldName.toLowerCase();
                 }
-                fields.put(fieldName, MySqlTypeUtils.toDataType(fieldType, precision, scale));
+                fields.put(
+                        fieldName,
+                        Tuple2.of(
+                                MySqlTypeUtils.toDataType(fieldType, precision, scale),
+                                fieldComment));
             }
         }
 
@@ -91,7 +98,7 @@ public class MySqlSchema {
         return tableName;
     }
 
-    public Map<String, DataType> fields() {
+    public Map<String, Tuple2<DataType, String>> fields() {
         return fields;
     }
 
@@ -100,14 +107,14 @@ public class MySqlSchema {
     }
 
     public MySqlSchema merge(MySqlSchema other) {
-        for (Map.Entry<String, DataType> entry : other.fields.entrySet()) {
+        for (Map.Entry<String, Tuple2<DataType, String>> entry : other.fields.entrySet()) {
             String fieldName = entry.getKey();
-            DataType newType = entry.getValue();
+            DataType newType = entry.getValue().f0;
             if (fields.containsKey(fieldName)) {
-                DataType oldType = fields.get(fieldName);
+                DataType oldType = fields.get(fieldName).f0;
                 switch (UpdatedDataFieldsProcessFunction.canConvert(oldType, newType)) {
                     case CONVERT:
-                        fields.put(fieldName, newType);
+                        fields.put(fieldName, Tuple2.of(newType, entry.getValue().f1));
                         break;
                     case EXCEPTION:
                         throw new IllegalArgumentException(
@@ -120,7 +127,7 @@ public class MySqlSchema {
                                         other.tableName));
                 }
             } else {
-                fields.put(fieldName, newType);
+                fields.put(fieldName, Tuple2.of(newType, entry.getValue().f1));
             }
         }
         if (!primaryKeys.equals(other.primaryKeys)) {
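
The comment text itself comes from the standard JDBC metadata column `REMARKS`. A self-contained sketch of that lookup (class and method names are illustrative; whether `REMARKS` is populated depends on the MySQL driver settings):

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.LinkedHashMap;
import java.util.Map;

public class ColumnCommentSketch {
    // Reads column name -> column comment for one table via DatabaseMetaData#getColumns,
    // the same REMARKS field used in the MySqlSchema change above.
    static Map<String, String> columnComments(Connection conn, String database, String table)
            throws SQLException {
        Map<String, String> comments = new LinkedHashMap<>();
        DatabaseMetaData metaData = conn.getMetaData();
        try (ResultSet rs = metaData.getColumns(database, null, table, null)) {
            while (rs.next()) {
                comments.put(rs.getString("COLUMN_NAME"), rs.getString("REMARKS"));
            }
        }
        return comments;
    }
}
```
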
diff --git a/paimon-flink/paimon-flink-common/src/test/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncTableActionITCase.java b/paimon-flink/paimon-flink-common/src/test/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncTableActionITCase.java
index d3e950d58..db8a93c21 100644
--- a/paimon-flink/paimon-flink-common/src/test/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncTableActionITCase.java
+++ b/paimon-flink/paimon-flink-common/src/test/java/org/apache/paimon/flink/action/cdc/mysql/MySqlSyncTableActionITCase.java
@@ -30,6 +30,7 @@ import org.apache.paimon.types.DataField;
 import org.apache.paimon.types.DataType;
 import org.apache.paimon.types.DataTypes;
 import org.apache.paimon.types.RowType;
+import org.apache.paimon.utils.JsonSerdeUtil;
 
 import org.apache.flink.api.common.JobStatus;
 import org.apache.flink.api.common.restartstrategy.RestartStrategies;
@@ -51,6 +52,7 @@ import java.util.concurrent.ThreadLocalRandom;
 import java.util.stream.Collectors;
 
 import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.jupiter.api.Assertions.assertEquals;
 import static org.junit.jupiter.api.Assertions.assertThrows;
 
 /** IT cases for {@link MySqlSyncTableAction}. */
@@ -95,6 +97,9 @@ public class MySqlSyncTableActionITCase extends MySqlActionITCaseBase {
             Thread.sleep(1000);
         }
 
+        checkTableSchema(
+                "[{\"id\":0,\"name\":\"pt\",\"type\":\"INT NOT NULL\",\"description\":\"primary\"},{\"id\":1,\"name\":\"_id\",\"type\":\"INT NOT NULL\",\"description\":\"_id\"},{\"id\":2,\"name\":\"v1\",\"type\":\"VARCHAR(10)\",\"description\":\"v1\"}]");
+
         try (Connection conn =
                 DriverManager.getConnection(
                         MYSQL_CONTAINER.getJdbcUrl(DATABASE_NAME),
@@ -106,6 +111,13 @@ public class MySqlSyncTableActionITCase extends MySqlActionITCaseBase {
         }
     }
 
+    private void checkTableSchema(String expected) throws Exception {
+
+        FileStoreTable table = getFileStoreTable();
+
+        assertEquals(expected, JsonSerdeUtil.toFlatJson(table.schema().fields()));
+    }
+
     private void testSchemaEvolutionImpl(Statement statement) throws Exception {
         FileStoreTable table = getFileStoreTable();
         statement.executeUpdate("USE paimon_sync_table");
@@ -280,6 +292,9 @@ public class MySqlSyncTableActionITCase extends MySqlActionITCaseBase {
             Thread.sleep(1000);
         }
 
+        checkTableSchema(
+                "[{\"id\":0,\"name\":\"_id\",\"type\":\"INT NOT NULL\",\"description\":\"primary\"},{\"id\":1,\"name\":\"v1\",\"type\":\"VARCHAR(10)\",\"description\":\"v1\"},{\"id\":2,\"name\":\"v2\",\"type\":\"INT\",\"description\":\"v2\"},{\"id\":3,\"name\":\"v3\",\"type\":\"VARCHAR(10)\",\"description\":\"v3\"}]");
+
         try (Connection conn =
                 DriverManager.getConnection(
                         MYSQL_CONTAINER.getJdbcUrl(DATABASE_NAME),
diff --git a/paimon-flink/paimon-flink-common/src/test/resources/mysql/setup.sql b/paimon-flink/paimon-flink-common/src/test/resources/mysql/setup.sql
index 580e13e3f..a3ee9a113 100644
--- a/paimon-flink/paimon-flink-common/src/test/resources/mysql/setup.sql
+++ b/paimon-flink/paimon-flink-common/src/test/resources/mysql/setup.sql
@@ -28,24 +28,24 @@ CREATE DATABASE paimon_sync_table;
 USE paimon_sync_table;
 
 CREATE TABLE schema_evolution_1 (
-    pt INT,
-    _id INT,
-    v1 VARCHAR(10),
+    pt INT comment  'primary',
+    _id INT comment  '_id',
+    v1 VARCHAR(10) comment  'v1',
     PRIMARY KEY (_id)
 );
 
 CREATE TABLE schema_evolution_2 (
-    pt INT,
-    _id INT,
-    v1 VARCHAR(10),
+    pt INT comment 'primary',
+    _id INT comment  '_id',
+    v1 VARCHAR(10) comment  'v1',
     PRIMARY KEY (_id)
 );
 
 CREATE TABLE schema_evolution_multiple (
-    _id INT,
-    v1 VARCHAR(10),
-    v2 INT,
-    v3 VARCHAR(10),
+    _id INT comment 'primary',
+    v1 VARCHAR(10) comment 'v1',
+    v2 INT comment 'v2',
+    v3 VARCHAR(10) comment 'v3',
     PRIMARY KEY (_id)
 );
 


[incubator-paimon] 01/10: [doc] Table is append-only if no primary key

Posted by lz...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.4
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git

commit b0b513f07c1fd90879d506b2956aebbe079a2c75
Author: JingsongLi <lz...@aliyun.com>
AuthorDate: Thu May 4 16:00:11 2023 +0800

    [doc] Table is append-only if no primary key
---
 docs/content/concepts/append-only-table.md | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/docs/content/concepts/append-only-table.md b/docs/content/concepts/append-only-table.md
index b776995e6..2e483acee 100644
--- a/docs/content/concepts/append-only-table.md
+++ b/docs/content/concepts/append-only-table.md
@@ -26,7 +26,7 @@ under the License.
 
 # Append Only Table
 
-By specifying `'write-mode' = 'append-only'` when creating the table, user creates an append-only table.
+If a table does not have a primary key defined, it is an append-only table by default.
 
 You can only insert a complete record into the table. No delete or update is supported and you cannot define primary keys.
 This type of table is suitable for use cases that do not require updates (such as log data synchronization).
@@ -176,7 +176,6 @@ CREATE TABLE MyTable (
     price DOUBLE,
     sales BIGINT
 ) WITH (
-    'write-mode' = 'append-only',
     'bucket' = '8',
     'bucket-key' = 'product_id'
 );
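
For completeness, the same append-only behaviour can be reached from the Java API: a hedged sketch, assuming an existing `Catalog` instance, that builds a schema without calling `primaryKey(...)` (table name and options below are placeholders, not from this commit):

```java
import org.apache.paimon.catalog.Catalog;
import org.apache.paimon.catalog.Identifier;
import org.apache.paimon.schema.Schema;
import org.apache.paimon.types.DataTypes;

import java.util.HashMap;
import java.util.Map;

public class AppendOnlySketch {
    // No primaryKey(...) call on the builder, so the created table is
    // append-only by default, as the doc change above states.
    static void createAppendOnlyTable(Catalog catalog) throws Exception {
        Map<String, String> options = new HashMap<>();
        options.put("bucket", "8");
        options.put("bucket-key", "product_id");
        Schema schema =
                Schema.newBuilder()
                        .column("product_id", DataTypes.BIGINT())
                        .column("price", DataTypes.DOUBLE())
                        .column("sales", DataTypes.BIGINT())
                        .options(options)
                        .build();
        catalog.createTable(new Identifier("my_db", "my_table"), schema, false);
    }
}
```
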