Posted to commits@seatunnel.apache.org by fa...@apache.org on 2022/06/02 10:34:25 UTC

[incubator-seatunnel] 01/01: Add ReleaseNotes of 2.1.2

This is an automated email from the ASF dual-hosted git repository.

fanjia pushed a commit to branch 2.1.2-release
in repository https://gitbox.apache.org/repos/asf/incubator-seatunnel.git

commit 15c5de3cb4ef4339307a7523e255ff61c84ee844
Author: fanjia <10...@qq.com>
AuthorDate: Thu Jun 2 18:26:55 2022 +0800

    Add ReleaseNotes of 2.1.2
---
 ReleaseNotes.md | 40 ++++++++++++++++++++++++++++++++++++++++
 1 file changed, 40 insertions(+)

diff --git a/ReleaseNotes.md b/ReleaseNotes.md
new file mode 100644
index 00000000..055283f5
--- /dev/null
+++ b/ReleaseNotes.md
@@ -0,0 +1,40 @@
+# 2.1.2 Release
+
+[Feature]
+- Add Spark webhook source
+- Support Flink application mode
+- Split connector jar from core jar
+- Add Replace transforms for Spark
+- Add Uuid transform for Spark
+- Support Flink dynamic configurations
+- Flink JDBC source supports the Oracle database
+- Add Flink HTTP connector
+- Add Flink transform for registering user-defined functions
+- Add Flink SQL Kafka and Elasticsearch connectors
+
+[Bugfix]
+- Fixed ClickHouse sink data type conversion error
+- Fixed the Spark start shell failing to run on first execution
+- Fixed the config file not being found when using Spark on YARN cluster mode
+- Fixed the issue that Spark extraJavaOptions can't be empty
+- Fixed the "plugins.tar.gz" decompression failure in Spark standalone cluster mode
+- Fixed ClickHouse sink not working correctly when using multiple hosts
+- Fixed Flink SQL conf parse exception
+- Fixed incomplete Flink JDBC MySQL data type mapping
+- Fixed the issue that variables cannot be set in Flink mode
+- Fixed the issue that the SeaTunnel Flink engine cannot check the source config
+
+[Improvement]
+- Update Jackson version to 2.12.6
+- Add a guide on how to set up SeaTunnel with Kubernetes
+- Optimize some generic type code
+- Add Flink SQL e2e module
+- Add pre-SQL and post-SQL support to the Flink JDBC connector
+- Use @AutoService to generate SPI files
+- Support Flink FakeSourceStream to mock data
+- Support reading Hive via the Flink JDBC source
+- ClickhouseFile supports ReplicatedMergeTree
+- Support using the Hive sink to save tables as ORCFileFormat
+- Support custom expire time for the Spark Redis sink
+- Add Spark JDBC isolationLevel config
+- Use Jackson to replace Fastjson
\ No newline at end of file
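
A note on the "@AutoService" item under [Improvement]: Google's AutoService annotation processor writes the META-INF/services entry at compile time, so plugin classes no longer need a hand-maintained SPI file. The sketch below is only illustrative; it assumes a hypothetical plugin interface named BaseSource and the com.google.auto.service:auto-service dependency on the annotation processor path, and the actual SeaTunnel interface names may differ.

    import com.google.auto.service.AutoService;

    // Hypothetical SPI interface, used here only for illustration.
    interface BaseSource {
        String getPluginName();
    }

    // The auto-service processor emits META-INF/services/<fully.qualified.BaseSource>
    // listing DemoSource, so java.util.ServiceLoader can discover it without a
    // hand-written SPI file.
    @AutoService(BaseSource.class)
    public class DemoSource implements BaseSource {
        @Override
        public String getPluginName() {
            return "DemoSource";
        }
    }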
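
Similarly, for the "Spark JDBC isolationLevel config" item: isolationLevel is a standard option of Spark's JDBC data source that sets the transaction isolation level used for the write connection. The sketch below shows that underlying Spark API directly rather than a SeaTunnel config (how SeaTunnel maps the setting may differ); the JDBC URL, table name, and credentials are placeholders.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SaveMode;
    import org.apache.spark.sql.SparkSession;

    public class JdbcIsolationLevelSketch {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("jdbc-isolation-level-sketch")
                    .master("local[*]")
                    .getOrCreate();

            // Placeholder data; in SeaTunnel this would come from the configured source plugin.
            Dataset<Row> df = spark.range(10).toDF("id");

            // isolationLevel accepts NONE, READ_COMMITTED, READ_UNCOMMITTED,
            // REPEATABLE_READ, or SERIALIZABLE and applies to the JDBC write.
            df.write()
              .format("jdbc")
              .option("url", "jdbc:mysql://localhost:3306/test")   // placeholder URL
              .option("dbtable", "demo_table")                     // placeholder table
              .option("user", "root")                              // placeholder credentials
              .option("password", "password")
              .option("isolationLevel", "READ_COMMITTED")
              .mode(SaveMode.Append)
              .save();

            spark.stop();
        }
    }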