Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2020/02/11 12:31:42 UTC

[GitHub] [flink] rkhachatryan opened a new pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

URL: https://github.com/apache/flink/pull/11061
 
 
   ## What is the purpose of the change
   
   1. Expose the existing sinks at the DataStream level, for users who don't need exactly-once semantics or for whom upsert is the better option; the existing API (mostly the Table API) shouldn't change.
   2. Refactor the current code for reuse in a new exactly-once JDBC sink.
   
   
   ## Brief change log
   
     - Minor changes: rename `JDBC` to `Jdbc`; introduce parameter objects
     - Extract connection management from `OutputFormat` into a new `JdbcConnectionProvider`
     - Replace `Row` with a type parameter
     - Move Table API specific code (such as `Tuple2<Boolean, T>`) to the appropriate classes
     - Refactor tests
     - Add the DataStream API (facade class)
     - Add docs
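The batching and "replace `Row` with a type parameter" items above can be modeled without any Flink dependencies. The sketch below is a deliberate simplification (the class and member names are hypothetical, not the PR's): the real `JdbcBatchingOutputFormat` also flushes on a timer and on checkpoints, while this shows only size-triggered batching over a generic record type `T`.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical, Flink-free model of batching over an arbitrary record type T
// (instead of the hard-coded Row type used before the refactoring).
final class BatchingBuffer<T> {
    private final List<T> batch = new ArrayList<>();
    private final int batchSize;
    private final Consumer<List<T>> flusher; // stands in for the JDBC batch executor

    BatchingBuffer(int batchSize, Consumer<List<T>> flusher) {
        this.batchSize = batchSize;
        this.flusher = flusher;
    }

    void add(T record) {
        batch.add(record);
        if (batch.size() >= batchSize) {
            flush();
        }
    }

    void flush() {
        if (!batch.isEmpty()) {
            flusher.accept(new ArrayList<>(batch)); // copy: the buffer is reused
            batch.clear();
        }
    }
}
```

Because the buffer is generic, the same code path serves both the Table API (where records were `Tuple2<Boolean, Row>`) and arbitrary DataStream record types.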
   
   
   ## Verifying this change
   
   This change is already covered by existing tests, such as *JDBCFullTest, JDBCOutputFormatTest, JdbcTableOutputFormatTest*.
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): no
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: yes
     - The serializers: no
     - The runtime per-record code paths (performance sensitive): yes
     - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: no
     - The S3 file system connector: no
   
   ## Documentation
   
     - Does this pull request introduce a new feature? yes
     - If yes, how is the feature documented? docs
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * c48ad371b961435b465aa1fd879d9d08c7d27f81 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/151231245) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

URL: https://github.com/apache/flink/pull/11061#discussion_r383682797
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcConnectionOptions.java
 ##########
 @@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.util.Preconditions;
+
+import java.io.Serializable;
+
+/**
+ * JDBC connection options.
+ */
+@PublicEvolving
+public class JdbcConnectionOptions implements Serializable {
+
+	private static final long serialVersionUID = 1L;
+
+	protected final String url;
+	protected final String driverName;
+	protected final String username;
+	protected final String password;
+
+	public JdbcConnectionOptions(String url, String driverName, String username, String password) {
 
 Review comment:
   Make this constructor `private`? We should encourage users to use the builder.
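A minimal sketch of the suggested approach: the constructor becomes private and a `Builder` is the only public entry point. The field names mirror the snippet above; the getter, the validation, and the builder method names are illustrative assumptions, not the PR's final API.

```java
import java.io.Serializable;
import java.util.Objects;

// Sketch: private constructor + builder, as suggested in the review.
// (The real class is public and @PublicEvolving; simplified here.)
class JdbcConnectionOptions implements Serializable {
    private static final long serialVersionUID = 1L;

    protected final String url;
    protected final String driverName;
    protected final String username;
    protected final String password;

    private JdbcConnectionOptions(String url, String driverName, String username, String password) {
        this.url = Objects.requireNonNull(url, "jdbc url is empty");
        this.driverName = driverName;
        this.username = username;
        this.password = password;
    }

    String getDbURL() {
        return url;
    }

    static class Builder {
        private String url;
        private String driverName;
        private String username;
        private String password;

        Builder withUrl(String url) { this.url = url; return this; }
        Builder withDriverName(String driverName) { this.driverName = driverName; return this; }
        Builder withUsername(String username) { this.username = username; return this; }
        Builder withPassword(String password) { this.password = password; return this; }

        JdbcConnectionOptions build() {
            return new JdbcConnectionOptions(url, driverName, username, password);
        }
    }
}
```

Besides steering users toward one construction path, the builder lets required fields be validated in one place (`build()`/constructor) while optional ones stay genuinely optional.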


[GitHub] [flink] kl0u closed pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

URL: https://github.com/apache/flink/pull/11061
 
 
   


[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 7eb48481d1ba9fb8a3ec4a6a165ba188be1b300d Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/151226586) 
   


[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

URL: https://github.com/apache/flink/pull/11061#discussion_r389771008
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/InsertOrUpdateJdbcExecutor.java
 ##########
 @@ -0,0 +1,124 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc.executor;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
+
+import java.sql.Connection;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.function.Function;
+
+/**
+ * {@link JdbcBatchStatementExecutor} that provides upsert semantics by updating row if it exists and inserting otherwise.
+ * Used in Table API.
+ */
+@Internal
+public final class InsertOrUpdateJdbcExecutor<R, K, V> implements JdbcBatchStatementExecutor<R> {
+
+	private final String existSQL;
+	private final String insertSQL;
+	private final String updateSQL;
+
+	private final JdbcStatementBuilder<K> existSetter;
+	private final JdbcStatementBuilder<V> insertSetter;
+	private final JdbcStatementBuilder<V> updateSetter;
+
+	private final Function<R, K> keyExtractor;
+	private final Function<R, V> valueMapper;
+
+	private transient PreparedStatement existStatement;
+	private transient PreparedStatement insertStatement;
+	private transient PreparedStatement updateStatement;
+	private transient Map<K, V> batch = new HashMap<>();
+
+	public InsertOrUpdateJdbcExecutor(
+		String existSQL,
 
 Review comment:
   What about one more tab on the arguments so that we can distinguish between the args and the body of the method?
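For reference, the insert-or-update flow of the class under review can be sketched without a database. In this simplification a `HashMap` stands in for both the table and the three prepared statements (exist/insert/update); everything beyond the names visible in the diff is hypothetical.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Database-free sketch of InsertOrUpdateJdbcExecutor's control flow:
// records are deduplicated by key in a batch map, then each entry is
// written as "update if the key exists, insert otherwise".
final class InsertOrUpdateSketch<R, K, V> {
    private final Map<K, V> batch = new HashMap<>(); // mirrors the executor's batch field
    private final Map<K, V> table = new HashMap<>(); // stands in for the database table
    private final Function<R, K> keyExtractor;
    private final Function<R, V> valueMapper;
    int inserts;
    int updates;

    InsertOrUpdateSketch(Function<R, K> keyExtractor, Function<R, V> valueMapper) {
        this.keyExtractor = keyExtractor;
        this.valueMapper = valueMapper;
    }

    void addToBatch(R record) {
        // later records for the same key overwrite earlier ones within a batch
        batch.put(keyExtractor.apply(record), valueMapper.apply(record));
    }

    void executeBatch() {
        for (Map.Entry<K, V> e : batch.entrySet()) {
            if (table.containsKey(e.getKey())) { // "exist" statement
                updates++;                       // "update" statement
            } else {
                inserts++;                       // "insert" statement
            }
            table.put(e.getKey(), e.getValue());
        }
        batch.clear();
    }
}
```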


[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

URL: https://github.com/apache/flink/pull/11061#discussion_r383176061
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,298 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final ExecutorCreator<JdbcExec> statementRunnerCreator;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			ExecutorCreator<JdbcExec> statementExecutorCreator,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
+		this.statementRunnerCreator = statementExecutorCreator;
 
 Review comment:
   Maybe unify the names: use either `runner` or `executor` for both the class (type) and the variable (instance). That makes the code easier to follow, I think.


[GitHub] [flink] rkhachatryan commented on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

URL: https://github.com/apache/flink/pull/11061#issuecomment-595163817
 
 
   I've rebased the PR onto the latest master.
   @kl0u, @wuchong, could you please take a look?


[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

URL: https://github.com/apache/flink/pull/11061#discussion_r389950670
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/JdbcBatchStatementExecutor.java
 ##########
 @@ -16,29 +16,27 @@
  * limitations under the License.
  */
 
-package org.apache.flink.api.java.io.jdbc.writer;
+package org.apache.flink.api.java.io.jdbc.executor;
 
-import org.apache.flink.api.java.tuple.Tuple2;
-import org.apache.flink.types.Row;
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
 
-import java.io.Serializable;
 import java.sql.Connection;
 import java.sql.SQLException;
+import java.util.function.Function;
 
 /**
- * JDBCWriter used to execute statements (e.g. INSERT, UPSERT, DELETE).
+ * Executes the given JDBC statement in batch for the accumulated records.
  */
-public interface JDBCWriter extends Serializable {
+@Internal
+public interface JdbcBatchStatementExecutor<T> {
 
 Review comment:
   At higher levels, it is wrapped because the only exception allowed there is `IOException`, not `SQLException`; the alternative to wrapping would be to declare `throws IOException` here as well.
   
   But IMO `SQLException` is better here because:
   - the caller can actually do something with it
   - it is at the appropriate abstraction level
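The layering described here — `SQLException` at the executor level, wrapped into `IOException` where only that is allowed — can be sketched as follows. This is stdlib-only; the class and method names are illustrative, not the PR's exact ones.

```java
import java.io.IOException;
import java.sql.SQLException;

// Low level: declares SQLException, so a caller can inspect SQL state, retry, etc.
interface BatchStatementExecutor<T> {
    void addToBatch(T record) throws SQLException;
    void executeBatch() throws SQLException;
}

// Higher level: its contract only allows IOException, so SQLException is wrapped.
final class BatchingOutputFormatSketch<T> {
    private final BatchStatementExecutor<T> executor;

    BatchingOutputFormatSketch(BatchStatementExecutor<T> executor) {
        this.executor = executor;
    }

    void writeRecord(T record) throws IOException {
        try {
            executor.addToBatch(record);
            executor.executeBatch();
        } catch (SQLException e) {
            // only IOException is allowed at this level, so wrap and keep the cause
            throw new IOException("Writing record to JDBC failed.", e);
        }
    }
}
```

The original `SQLException` stays reachable via `getCause()`, so callers that do care about SQL details lose nothing by the wrapping.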
   


[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 1b44ded44256c4061c857970a97c4b71f8a6b67d Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/152619337) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=6126) 
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385565414
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcExecutionOptions.java
 ##########
 @@ -0,0 +1,92 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.util.Preconditions;
+
+import java.io.Serializable;
+
+/**
+ * JDBC sink batch options.
+ */
+@PublicEvolving
+public class JdbcExecutionOptions implements Serializable {
 
 Review comment:
   I agree, `JdbcExecutionOptions` isn't ideal, but other names aren't better.

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   * 84593f60a55ba420204cb263172be5e75463bd8b Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148445223) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072) 
   * e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148531545) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5088) 
   * b465793e72972b78ba0bb87921ac001fec959bb3 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/149292262) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5250) 
   

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-592456585
 
 
   Thanks a lot for reviewing, @aljoscha, @kl0u, @wuchong.
   I've addressed the issues and am currently waiting for another PR to pass CI (I haven't pushed yet).

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385127113
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,298 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final ExecutorCreator<JdbcExec> statementRunnerCreator;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			ExecutorCreator<JdbcExec> statementExecutorCreator,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
+		this.statementRunnerCreator = statementExecutorCreator;
+		this.jdbcRecordExtractor = recordExtractor;
+	}
+
+	/**
+	 * Connects to the target database and initializes the prepared statement.
+	 *
+	 * @param taskNumber The number of the parallel instance.
+	 */
+	@Override
+	public void open(int taskNumber, int numTasks) throws IOException {
+		super.open(taskNumber, numTasks);
+		jdbcStatementExecutor = statementRunnerCreator.apply(getRuntimeContext());
 
 Review comment:
   Yes, I think that's a good idea.

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389930253
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/InsertOrUpdateJdbcExecutor.java
 ##########
 @@ -0,0 +1,124 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc.executor;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
+
+import java.sql.Connection;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.function.Function;
+
+/**
+ * {@link JdbcBatchStatementExecutor} that provides upsert semantics by updating row if it exists and inserting otherwise.
+ * Used in Table API.
+ */
+@Internal
+public final class InsertOrUpdateJdbcExecutor<R, K, V> implements JdbcBatchStatementExecutor<R> {
+
+	private final String existSQL;
+	private final String insertSQL;
+	private final String updateSQL;
+
+	private final JdbcStatementBuilder<K> existSetter;
+	private final JdbcStatementBuilder<V> insertSetter;
+	private final JdbcStatementBuilder<V> updateSetter;
+
+	private final Function<R, K> keyExtractor;
+	private final Function<R, V> valueMapper;
+
+	private transient PreparedStatement existStatement;
+	private transient PreparedStatement insertStatement;
+	private transient PreparedStatement updateStatement;
+	private transient Map<K, V> batch = new HashMap<>();
+
+	public InsertOrUpdateJdbcExecutor(
+		String existSQL,
 
 Review comment:
   How do I always mess up the indentation 🤦‍♂️

----------------------------------------------------------------

[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389771463
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/InsertOrUpdateJdbcExecutor.java
 ##########
 @@ -0,0 +1,124 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc.executor;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
+
+import java.sql.Connection;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.function.Function;
+
+/**
+ * {@link JdbcBatchStatementExecutor} that provides upsert semantics by updating row if it exists and inserting otherwise.
+ * Used in Table API.
+ */
+@Internal
+public final class InsertOrUpdateJdbcExecutor<R, K, V> implements JdbcBatchStatementExecutor<R> {
+
+	private final String existSQL;
+	private final String insertSQL;
+	private final String updateSQL;
+
+	private final JdbcStatementBuilder<K> existSetter;
+	private final JdbcStatementBuilder<V> insertSetter;
+	private final JdbcStatementBuilder<V> updateSetter;
+
+	private final Function<R, K> keyExtractor;
+	private final Function<R, V> valueMapper;
+
+	private transient PreparedStatement existStatement;
+	private transient PreparedStatement insertStatement;
+	private transient PreparedStatement updateStatement;
+	private transient Map<K, V> batch = new HashMap<>();
+
+	public InsertOrUpdateJdbcExecutor(
+		String existSQL,
+		String insertSQL,
+		String updateSQL,
+		JdbcStatementBuilder<K> existSetter,
+		JdbcStatementBuilder<V> insertSetter,
+		JdbcStatementBuilder<V> updateSetter,
+		Function<R, K> keyExtractor,
+		Function<R, V> valueExtractor) {
+		this.existSQL = existSQL;
 
 Review comment:
   I think it makes sense to add null checks here so that the contracts of the class are explicit.
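   A minimal sketch of what such explicit contract checks could look like, using plain `java.util.Objects.requireNonNull` (Flink's `Preconditions.checkNotNull` offers the same fail-fast semantics). The trimmed-down class below is illustrative only, not the actual `InsertOrUpdateJdbcExecutor`:

   ```java
   import java.util.Objects;

   // Hypothetical reduced constructor showing explicit null checks;
   // field names mirror the reviewed class, but this is not the Flink code.
   final class InsertOrUpdateOptions {
       private final String existSQL;
       private final String insertSQL;
       private final String updateSQL;

       InsertOrUpdateOptions(String existSQL, String insertSQL, String updateSQL) {
           // Fail fast at construction time with a descriptive message,
           // instead of a bare NullPointerException later during execution.
           this.existSQL = Objects.requireNonNull(existSQL, "existSQL must not be null");
           this.insertSQL = Objects.requireNonNull(insertSQL, "insertSQL must not be null");
           this.updateSQL = Objects.requireNonNull(updateSQL, "updateSQL must not be null");
       }

       String existSQL() { return existSQL; }
   }
   ```

   Passing `null` for any argument now fails immediately in the constructor, making the class contract explicit at the call site.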

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385141186
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,298 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final ExecutorCreator<JdbcExec> statementRunnerCreator;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			ExecutorCreator<JdbcExec> statementExecutorCreator,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
+		this.statementRunnerCreator = statementExecutorCreator;
+		this.jdbcRecordExtractor = recordExtractor;
+	}
+
+	/**
+	 * Connects to the target database and initializes the prepared statement.
+	 *
+	 * @param taskNumber The number of the parallel instance.
+	 */
+	@Override
+	public void open(int taskNumber, int numTasks) throws IOException {
+		super.open(taskNumber, numTasks);
+		jdbcStatementExecutor = statementRunnerCreator.apply(getRuntimeContext());
+		try {
+			jdbcStatementExecutor.open(connection);
+		} catch (SQLException e) {
+			throw new IOException("unable to open JDBC writer", e);
+		}
+		if (executionOptions.getBatchIntervalMs() != 0 && executionOptions.getBatchSize() != 1) {
+			this.scheduler = Executors.newScheduledThreadPool(1, new ExecutorThreadFactory("jdbc-upsert-output-format"));
+			this.scheduledFuture = this.scheduler.scheduleWithFixedDelay(() -> {
+				synchronized (JdbcBatchingOutputFormat.this) {
+					if (!closed) {
+						try {
+							flush();
+						} catch (Exception e) {
+							flushException = e;
+						}
+					}
+				}
+			}, executionOptions.getBatchIntervalMs(), executionOptions.getBatchIntervalMs(), TimeUnit.MILLISECONDS);
+		}
+	}
+
+	private void checkFlushException() {
+		if (flushException != null) {
+			throw new RuntimeException("Writing records to JDBC failed.", flushException);
+		}
+	}
+
+	@Override
+	public final synchronized void writeRecord(In record) {
+		checkFlushException();
+
+		try {
+			doWriteRecord(record);
+			batchCount++;
+			if (batchCount >= executionOptions.getBatchSize()) {
+				flush();
+			}
+		} catch (Exception e) {
+			throw new RuntimeException("Writing records to JDBC failed.", e);
+		}
+	}
+
+	void doWriteRecord(In record) throws SQLException {
+		jdbcStatementExecutor.process(jdbcRecordExtractor.apply(record));
+	}
+
+	@Override
+	public synchronized void flush() throws IOException {
+		checkFlushException();
+
+		for (int i = 1; i <= executionOptions.getMaxRetries(); i++) {
+			try {
+				attemptFlush();
 
 Review comment:
   Yes. I would also replace `Thread.sleep(1000 * i);` with just rescheduling (retaining blocking semantics of `write`).
   But I'm not sure if such changes should be in this PR.
   WDYT?
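   The flush-with-retries pattern under discussion can be sketched as follows. This is an illustrative standalone class, not the Flink code: `attemptFlush()` and `maxRetries` stand in for the reviewed class's members, and the blocking linear backoff is exactly what the comment suggests replacing with rescheduling on the executor:

   ```java
   import java.io.IOException;
   import java.util.concurrent.atomic.AtomicInteger;

   // Sketch of flush-with-retries: retry up to maxRetries attempts,
   // backing off linearly between failed attempts.
   final class RetryingFlusher {
       private final int maxRetries;
       final AtomicInteger attempts = new AtomicInteger();
       private final int failUntil; // test hook: fail this many attempts before succeeding

       RetryingFlusher(int maxRetries, int failUntil) {
           this.maxRetries = maxRetries;
           this.failUntil = failUntil;
       }

       private void attemptFlush() throws IOException {
           // Stand-in for the real batch execution against the database.
           if (attempts.incrementAndGet() <= failUntil) {
               throw new IOException("transient failure");
           }
       }

       void flush() throws IOException, InterruptedException {
           for (int i = 1; i <= maxRetries; i++) {
               try {
                   attemptFlush();
                   return; // success
               } catch (IOException e) {
                   if (i >= maxRetries) {
                       throw new IOException("flush failed after " + i + " attempts", e);
                   }
                   // Blocking linear backoff; the review suggests replacing this
                   // sleep with rescheduling, while keeping write() blocking.
                   Thread.sleep(10L * i);
               }
           }
       }
   }
   ```

   With `failUntil = 2` and `maxRetries = 3`, the first two attempts fail and the third succeeds, so `flush()` returns normally after three attempts.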

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383695924
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/JdbcBatchStatementExecutor.java
 ##########
 @@ -0,0 +1,102 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc.executor;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.JdbcDmlOptions;
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
+import org.apache.flink.types.Row;
+
+import java.sql.Connection;
+import java.sql.SQLException;
+import java.util.Arrays;
+import java.util.function.Function;
+
+import static org.apache.flink.api.java.io.jdbc.JDBCUtils.getPrimaryKey;
+import static org.apache.flink.api.java.io.jdbc.JDBCUtils.setRecordToStatement;
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+/**
+ * JDBCWriter used to execute statements (e.g. INSERT, UPSERT, DELETE).
 
 Review comment:
   Please update the Javadoc. 

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383681913
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcConnectionOptions.java
 ##########
 @@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.util.Preconditions;
+
+import java.io.Serializable;
+
+/**
+ * JDBC connection options.
+ */
+@PublicEvolving
+public class JdbcConnectionOptions implements Serializable {
+
+	private static final long serialVersionUID = 1L;
+
+	protected final String url;
+	protected final String driverName;
+	protected final String username;
+	protected final String password;
 
 Review comment:
   Add a `@Nullable` annotation to `username` and `password` and use Optional for the getter method?
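   A sketch of that suggestion: nullable credential fields with `Optional`-returning getters. The class below is illustrative only (not the actual `JdbcConnectionOptions`), and the `@Nullable` annotation is shown as comments to keep the sketch self-contained:

   ```java
   import java.util.Optional;

   // Hypothetical connection options with optional credentials.
   final class ConnectionOptions {
       private final String url;      // required
       private final String username; // @Nullable: may be absent (e.g. trusted auth)
       private final String password; // @Nullable: may be absent

       ConnectionOptions(String url, String username, String password) {
           this.url = url;
           this.username = username;
           this.password = password;
       }

       String getDbURL() { return url; }

       // Optional.ofNullable maps a null field to Optional.empty(),
       // so callers must handle the absent case explicitly.
       Optional<String> getUsername() { return Optional.ofNullable(username); }

       Optional<String> getPassword() { return Optional.ofNullable(password); }
   }
   ```

   Callers then write `options.getUsername().ifPresent(conn::setUser)` instead of a null check, which makes the optionality part of the API contract.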

----------------------------------------------------------------

[GitHub] [flink] aljoscha commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
aljoscha commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r382585012
 
 

 ##########
 File path: docs/redirects/jdbc.md
 ##########
 @@ -0,0 +1,24 @@
+---
 
 Review comment:
   The redirects are only there because previously the documentation structure was different. For new pages we don't need redirects.

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383741446
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcSink.java
 ##########
 @@ -0,0 +1,63 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.streaming.api.functions.sink.SinkFunction;
+
+/**
+ * Facade to create JDBC {@link SinkFunction sinks}.
+ */
+@PublicEvolving
+public class JdbcSink {
 
 Review comment:
    Shall we expose a corresponding `JdbcSource`, or will it be introduced later?

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   * 84593f60a55ba420204cb263172be5e75463bd8b Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148445223) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072) 
   * e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148531545) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5088) 
   * b465793e72972b78ba0bb87921ac001fec959bb3 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/149292262) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5250) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383726014
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JDBCOutputFormat.java
 ##########
 @@ -187,25 +199,20 @@ public JDBCOutputFormatBuilder setSqlTypes(int[] typesArray) {
 		 * @return Configured JDBCOutputFormat
 		 */
 		public JDBCOutputFormat finish() {
+			return new JDBCOutputFormat(new SimpleJdbcConnectionProvider(buildConnectionOptions()),
 
 Review comment:
    Please put each argument on a separate line, including the first one.
   This is also mentioned in the Flink Code Style: https://flink.apache.org/contributing/code-style-and-quality-formatting.html#breaking-the-lines-of-too-long-statements
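
    For illustration, a self-contained sketch of that rule (the method and values below are made up, not from the PR): once a call no longer fits on one line, every argument, including the first, moves to its own line.

    ```java
    public class LineBreakingStyle {

        // Hypothetical method standing in for a long constructor call.
        static String describe(
                String table,
                String dialect,
                int batchSize) {
            return table + "/" + dialect + "/" + batchSize;
        }

        public static void main(String[] args) {
            // Each argument on its own line, the first one included.
            String description = describe(
                "books",
                "derby",
                500);
            System.out.println(description); // prints books/derby/500
        }
    }
    ```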
   

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383735190
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JDBCTestCheckpoint.java
 ##########
 @@ -0,0 +1,35 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+/**
+ * Holds id and indices of items in {@link JdbcTestFixture#TEST_DATA}.
+ */
+public class JDBCTestCheckpoint {
 
 Review comment:
    This class is never used?

----------------------------------------------------------------


[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383683540
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcDmlOptions.java
 ##########
 @@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.java.io.jdbc.dialect.JDBCDialect;
+import org.apache.flink.util.Preconditions;
+
+import javax.annotation.Nullable;
+
+import java.util.stream.Stream;
+
+/**
+ * JDBC sink DML options.
+ */
+public class JdbcDmlOptions extends JdbcTypedQueryOptions {
+
+	private static final long serialVersionUID = 1L;
+
+	private final String[] fieldNames;
+	@Nullable
+	private final String[] keyFields;
+	private final String tableName;
+	private final JDBCDialect dialect;
+
+	public static JdbcDmlOptionsBuilder builder() {
+		return new JdbcDmlOptionsBuilder();
+	}
+
+	private JdbcDmlOptions(String tableName, JDBCDialect dialect, String[] fieldNames, int[] fieldTypes, String[] keyFields) {
+		super(fieldTypes);
+		this.tableName = Preconditions.checkNotNull(tableName, "table is empty");
+		this.dialect = Preconditions.checkNotNull(dialect, "dialect name is empty");
+		this.fieldNames = Preconditions.checkNotNull(fieldNames, "field names is empty");
+		this.keyFields = keyFields;
+	}
+
+	public String getTableName() {
+		return tableName;
+	}
+
+	public JDBCDialect getDialect() {
+		Preconditions.checkNotNull(dialect, "dialect not set");
 
 Review comment:
    We don't need to check this, because it is verified in the constructor.

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385232876
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/KeyedBatchStatementExecutor.java
 ##########
 @@ -0,0 +1,79 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc.executor;
+
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
+
+import java.sql.Connection;
+import java.sql.PreparedStatement;
+import java.sql.SQLException;
+import java.util.HashSet;
+import java.util.Set;
+import java.util.function.Function;
+
+class KeyedBatchStatementExecutor<T, K> implements JdbcBatchStatementExecutor<T> {
+
+	private final String sql;
+	private final JdbcStatementBuilder<K> parameterSetter;
+	private final Function<T, K> keyExtractor;
+
+	private transient Set<K> batch = new HashSet<>();
+	private transient PreparedStatement st;
+
+	/**
+	 * Keep in mind object reuse: if it is enabled, the key extractor may be required to return a new object.
+	 */
+	KeyedBatchStatementExecutor(String sql, Function<T, K> keyExtractor, JdbcStatementBuilder<K> statementBuilder) {
+		this.parameterSetter = statementBuilder;
+		this.keyExtractor = keyExtractor;
+		this.sql = sql;
+	}
+
+	@Override
+	public void open(Connection connection) throws SQLException {
+		batch = new HashSet<>();
+		st = connection.prepareStatement(sql);
+	}
+
+	@Override
+	public void process(T record) {
+		batch.add(keyExtractor.apply(record));
+	}
+
+	@Override
+	public void executeBatch() throws SQLException {
+		if (!batch.isEmpty()) {
+			for (K entry : batch) {
+				parameterSetter.accept(st, entry);
 
 Review comment:
   You are right, thanks for pointing this out.
   I'll change it to use `SimpleBatchStatementExecutor`.
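
    The object-reuse caveat in the constructor Javadoc above can be demonstrated without Flink: if the runtime reuses one mutable record instance, a key extractor that does not copy ends up batching the same object over and over. The classes below are made up for illustration only:

    ```java
    import java.util.HashSet;
    import java.util.Set;

    public class ObjectReuseDemo {

        // Hypothetical mutable key, standing in for a reused record.
        static final class MutableKey {
            int id;
            MutableKey(int id) { this.id = id; }
            @Override public boolean equals(Object o) {
                return o instanceof MutableKey && ((MutableKey) o).id == id;
            }
            @Override public int hashCode() { return id; }
        }

        // Extractor returns the shared instance as-is: every entry in the batch
        // points at the same object, which holds only the last id written.
        static Set<MutableKey> batchWithoutCopy() {
            Set<MutableKey> batch = new HashSet<>();
            MutableKey shared = new MutableKey(0);
            for (int i = 1; i <= 3; i++) {
                shared.id = i;      // the runtime mutates the reused instance
                batch.add(shared);
            }
            return batch;
        }

        // Extractor returns a new object per record, as the Javadoc requires.
        static Set<MutableKey> batchWithCopy() {
            Set<MutableKey> batch = new HashSet<>();
            MutableKey shared = new MutableKey(0);
            for (int i = 1; i <= 3; i++) {
                shared.id = i;
                batch.add(new MutableKey(shared.id)); // defensive copy
            }
            return batch;
        }

        static long distinctIds(Set<MutableKey> batch) {
            return batch.stream().mapToInt(k -> k.id).distinct().count();
        }

        public static void main(String[] args) {
            System.out.println(distinctIds(batchWithoutCopy())); // prints 1
            System.out.println(distinctIds(batchWithCopy()));    // prints 3
        }
    }
    ```

    Without the copy, the sink would write the last key three times instead of three distinct keys.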

----------------------------------------------------------------


[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385165114
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/JdbcBatchStatementExecutor.java
 ##########
 @@ -0,0 +1,102 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc.executor;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.JdbcDmlOptions;
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
+import org.apache.flink.types.Row;
+
+import java.sql.Connection;
+import java.sql.SQLException;
+import java.util.Arrays;
+import java.util.function.Function;
+
+import static org.apache.flink.api.java.io.jdbc.JDBCUtils.getPrimaryKey;
+import static org.apache.flink.api.java.io.jdbc.JDBCUtils.setRecordToStatement;
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+/**
+ * Executor used to run JDBC statements (e.g. INSERT, UPSERT, DELETE) in batches.
+ */
+@Internal
+public interface JdbcBatchStatementExecutor<T> {
+
+	/**
+	 * Opens the executor with the given JDBC connection, which it can use to create statements.
+	 */
+	void open(Connection connection) throws SQLException;
+
+	void process(T record) throws SQLException;
 
 Review comment:
   Yes, I think it's a good idea.
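
    The interface above buffers records in `process` and flushes them in `executeBatch`. A minimal sketch of that buffering pattern, with the JDBC `Connection`/`PreparedStatement` replaced by a plain `Consumer` so the example is self-contained (the class and method names below are illustrative, not Flink's):

    ```java
    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Consumer;

    // Illustrative stand-in for a batch statement executor: records are buffered
    // and handed to a flush action in one go (a PreparedStatement batch in the real sink).
    public class BufferingExecutor<T> {

        private final List<T> batch = new ArrayList<>();
        private final Consumer<List<T>> flushAction;

        public BufferingExecutor(Consumer<List<T>> flushAction) {
            this.flushAction = flushAction;
        }

        // Corresponds to process(T): only buffers, no I/O yet.
        public void process(T record) {
            batch.add(record);
        }

        // Corresponds to executeBatch(): flushes everything buffered so far.
        public void executeBatch() {
            if (!batch.isEmpty()) {
                flushAction.accept(new ArrayList<>(batch));
                batch.clear();
            }
        }

        public int pending() {
            return batch.size();
        }
    }
    ```

    Deferring all I/O to `executeBatch` is what lets the sink group many records into a single round trip to the database.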

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389944898
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcDmlOptions.java
 ##########
 @@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.java.io.jdbc.dialect.JDBCDialect;
+import org.apache.flink.util.Preconditions;
+
+import javax.annotation.Nullable;
+
+import java.util.Optional;
+import java.util.stream.Stream;
+
+/**
+ * JDBC sink DML options.
+ */
+public class JdbcDmlOptions extends JdbcTypedQueryOptions {
+
+	private static final long serialVersionUID = 1L;
+
+	private final String[] fieldNames;
+	@Nullable
+	private final String[] keyFields;
+	private final String tableName;
+	private final JDBCDialect dialect;
+
+	public static JdbcDmlOptionsBuilder builder() {
+		return new JdbcDmlOptionsBuilder();
+	}
+
+	private JdbcDmlOptions(String tableName, JDBCDialect dialect, String[] fieldNames, int[] fieldTypes, String[] keyFields) {
+		super(fieldTypes);
+		this.tableName = Preconditions.checkNotNull(tableName, "table is empty");
+		this.dialect = Preconditions.checkNotNull(dialect, "dialect name is empty");
+		this.fieldNames = Preconditions.checkNotNull(fieldNames, "field names is empty");
+		this.keyFields = keyFields;
+	}
+
+	public String getTableName() {
+		return tableName;
+	}
+
+	public JDBCDialect getDialect() {
+		return dialect;
+	}
+
+	public String[] getFieldNames() {
+		return fieldNames;
+	}
+
+	public Optional<String[]> getKeyFields() {
+		return Optional.ofNullable(keyFields);
+	}
+
+	/**
+	 * Builder for {@link JdbcDmlOptions}.
+	 */
+	public static class JdbcDmlOptionsBuilder extends JDBCUpdateQueryOptionsBuilder<JdbcDmlOptionsBuilder> {
+		private String tableName;
+		private String[] fieldNames;
+		private String[] keyFields;
+		private JDBCDialect dialect;
+		private JDBCDialect customDialect;
 
 Review comment:
   Leftover from some previous iteration. 
   Removed, thanks.

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   * 84593f60a55ba420204cb263172be5e75463bd8b Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148445223) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072) 
   * e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148531545) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5088) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   * 84593f60a55ba420204cb263172be5e75463bd8b Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148445223) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072) 
   * e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148531545) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5088) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383696132
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/JdbcBatchStatementExecutor.java
 ##########
 @@ -0,0 +1,102 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc.executor;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.JdbcDmlOptions;
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
+import org.apache.flink.types.Row;
+
+import java.sql.Connection;
+import java.sql.SQLException;
+import java.util.Arrays;
+import java.util.function.Function;
+
+import static org.apache.flink.api.java.io.jdbc.JDBCUtils.getPrimaryKey;
+import static org.apache.flink.api.java.io.jdbc.JDBCUtils.setRecordToStatement;
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+/**
+ * Executes JDBC statements (e.g. INSERT, UPSERT, DELETE) in batches.
+ */
+@Internal
+public interface JdbcBatchStatementExecutor<T> {
+
+	/**
+	 * Opens the writer with the given JDBC Connection; implementations may create Statements from it.
+	 */
+	void open(Connection connection) throws SQLException;
+
+	void process(T record) throws SQLException;
 
 Review comment:
   What do you think about renaming this method to `addBatch(T record)`? That would align better with the interface name, i.e. records are added to batched statements rather than "processed" immediately.
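A hypothetical sketch of the rename being discussed (the real interface lives in `org.apache.flink.api.java.io.jdbc.executor`; the in-memory implementation below exists only to illustrate why add-then-execute naming fits the lifecycle):

```java
import java.util.ArrayList;
import java.util.List;

// Records are *added* to a batch and only written out on executeBatch(),
// which is why addBatch(...) reads more naturally than process(...).
interface BatchExecutor<T> {
    void addBatch(T record);
    void executeBatch();
}

final class InMemoryBatchExecutor implements BatchExecutor<String> {
    final List<String> pending = new ArrayList<>();
    final List<String> written = new ArrayList<>();

    @Override
    public void addBatch(String record) {
        pending.add(record);
    }

    @Override
    public void executeBatch() {
        // A real executor would call PreparedStatement#executeBatch() here.
        written.addAll(pending);
        pending.clear();
    }
}
```

Nothing reaches `written` until `executeBatch()` runs, mirroring how the JDBC sink buffers records between flushes.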

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * c48ad371b961435b465aa1fd879d9d08c7d27f81 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/151231245) 
   * 246a3cb4a62b59256ced746abfecfa18d5064745 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385168352
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JDBCTestCheckpoint.java
 ##########
 @@ -0,0 +1,35 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+/**
+ * Holds id and indices of items in {@link JdbcTestFixture#TEST_DATA}.
+ */
+public class JDBCTestCheckpoint {
 
 Review comment:
   Yes, this is a leftover after splitting the work.
   Removed it, thanks.

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 7eb48481d1ba9fb8a3ec4a6a165ba188be1b300d Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/151226586) 
   * c48ad371b961435b465aa1fd879d9d08c7d27f81 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383180184
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/dialect/JDBCDialects.java
 ##########
 @@ -152,7 +152,7 @@ public String quoteIdentifier(String identifier) {
 			return identifier;
 		}
 
-		@Override
+@Override
 
 Review comment:
   Probably a typo?

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385133281
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,298 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final ExecutorCreator<JdbcExec> statementRunnerCreator;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			ExecutorCreator<JdbcExec> statementExecutorCreator,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
+		this.statementRunnerCreator = statementExecutorCreator;
+		this.jdbcRecordExtractor = recordExtractor;
+	}
+
+	/**
+	 * Connects to the target database and initializes the prepared statement.
+	 *
+	 * @param taskNumber The number of the parallel instance.
+	 */
+	@Override
+	public void open(int taskNumber, int numTasks) throws IOException {
+		super.open(taskNumber, numTasks);
+		jdbcStatementExecutor = statementRunnerCreator.apply(getRuntimeContext());
+		try {
+			jdbcStatementExecutor.open(connection);
+		} catch (SQLException e) {
+			throw new IOException("unable to open JDBC writer", e);
+		}
+		if (executionOptions.getBatchIntervalMs() != 0 && executionOptions.getBatchSize() != 1) {
+			this.scheduler = Executors.newScheduledThreadPool(1, new ExecutorThreadFactory("jdbc-upsert-output-format"));
+			this.scheduledFuture = this.scheduler.scheduleWithFixedDelay(() -> {
+				synchronized (JdbcBatchingOutputFormat.this) {
+					if (!closed) {
 
 Review comment:
   `close()` is also `synchronized`, so these sections can't execute simultaneously.
   
   `volatile` seems unnecessary here; I didn't remove the modifier because I tried to keep the internal logic untouched where possible (performance-wise it doesn't matter, since this code is not on the hot path).
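The interplay described here can be sketched with a simplified stand-in (not the actual Flink class): the timer callback and `close()` synchronize on the same monitor, so the `closed` check inside the callback is race-free even without `volatile`.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Minimal model of the pattern in JdbcBatchingOutputFormat: a scheduled
// flusher and close() take the same lock, so a flush can never observe a
// half-closed writer.
class PeriodicFlusher {
    private final ScheduledExecutorService scheduler =
            Executors.newScheduledThreadPool(1);
    private boolean closed = false;
    int flushCount = 0;

    void start(long intervalMs) {
        scheduler.scheduleWithFixedDelay(() -> {
            synchronized (this) {
                if (!closed) {       // guarded by the monitor, as in the original
                    flushCount++;    // stands in for the actual flush()
                }
            }
        }, intervalMs, intervalMs, TimeUnit.MILLISECONDS);
    }

    synchronized void close() {      // same monitor as the callback above
        closed = true;
        scheduler.shutdown();
    }
}
```

Because both paths hold the monitor, a callback that races with `close()` either completes its flush first or sees `closed == true` and does nothing.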

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383731330
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JdbcE2eTest.java
 ##########
 @@ -0,0 +1,103 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.restartstrategy.RestartStrategies;
+import org.apache.flink.api.java.io.jdbc.JdbcTestFixture.TestEntry;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+
+import org.junit.Ignore;
+import org.junit.Test;
+
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.sql.Types;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.List;
+
+import static org.apache.flink.api.java.io.jdbc.JdbcTestFixture.INPUT_TABLE;
+import static org.apache.flink.api.java.io.jdbc.JdbcTestFixture.INSERT_TEMPLATE;
+import static org.apache.flink.api.java.io.jdbc.JdbcTestFixture.TEST_DATA;
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Smoke tests for the {@link JdbcSink} and the underlying classes.
+ */
+public class JdbcE2eTest extends JDBCTestBase {
 
 Review comment:
   I would call this `JdbcITCase`, because we have a `flink-end-to-end-tests` module for tests that run in an end-to-end environment.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389775134
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcDmlOptions.java
 ##########
 @@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.java.io.jdbc.dialect.JDBCDialect;
+import org.apache.flink.util.Preconditions;
+
+import javax.annotation.Nullable;
+
+import java.util.Optional;
+import java.util.stream.Stream;
+
+/**
+ * JDBC sink DML options.
+ */
+public class JdbcDmlOptions extends JdbcTypedQueryOptions {
+
+	private static final long serialVersionUID = 1L;
+
+	private final String[] fieldNames;
+	@Nullable
+	private final String[] keyFields;
+	private final String tableName;
+	private final JDBCDialect dialect;
+
+	public static JdbcDmlOptionsBuilder builder() {
+		return new JdbcDmlOptionsBuilder();
+	}
+
+	private JdbcDmlOptions(String tableName, JDBCDialect dialect, String[] fieldNames, int[] fieldTypes, String[] keyFields) {
+		super(fieldTypes);
+		this.tableName = Preconditions.checkNotNull(tableName, "table is empty");
+		this.dialect = Preconditions.checkNotNull(dialect, "dialect name is empty");
+		this.fieldNames = Preconditions.checkNotNull(fieldNames, "field names is empty");
+		this.keyFields = keyFields;
+	}
+
+	public String getTableName() {
+		return tableName;
+	}
+
+	public JDBCDialect getDialect() {
+		return dialect;
+	}
+
+	public String[] getFieldNames() {
+		return fieldNames;
+	}
+
+	public Optional<String[]> getKeyFields() {
+		return Optional.ofNullable(keyFields);
+	}
+
+	/**
+	 * JDBCUpsertOptionsBuilder.
+	 */
+	public static class JdbcDmlOptionsBuilder extends JDBCUpdateQueryOptionsBuilder<JdbcDmlOptionsBuilder> {
+		private String tableName;
+		private String[] fieldNames;
+		private String[] keyFields;
+		private JDBCDialect dialect;
+		private JDBCDialect customDialect;
 
 Review comment:
   This seems to not be used anywhere.

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385133281
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,298 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final ExecutorCreator<JdbcExec> statementRunnerCreator;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			ExecutorCreator<JdbcExec> statementExecutorCreator,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
+		this.statementRunnerCreator = statementExecutorCreator;
+		this.jdbcRecordExtractor = recordExtractor;
+	}
+
+	/**
+	 * Connects to the target database and initializes the prepared statement.
+	 *
+	 * @param taskNumber The number of the parallel instance.
+	 */
+	@Override
+	public void open(int taskNumber, int numTasks) throws IOException {
+		super.open(taskNumber, numTasks);
+		jdbcStatementExecutor = statementRunnerCreator.apply(getRuntimeContext());
+		try {
+			jdbcStatementExecutor.open(connection);
+		} catch (SQLException e) {
+			throw new IOException("unable to open JDBC writer", e);
+		}
+		if (executionOptions.getBatchIntervalMs() != 0 && executionOptions.getBatchSize() != 1) {
+			this.scheduler = Executors.newScheduledThreadPool(1, new ExecutorThreadFactory("jdbc-upsert-output-format"));
+			this.scheduledFuture = this.scheduler.scheduleWithFixedDelay(() -> {
+				synchronized (JdbcBatchingOutputFormat.this) {
+					if (!closed) {
 
 Review comment:
   `close()` is also `synchronized` so these sections can't execute simultaneously.
   
   `volatile` seems unnecessary here; I didn't remove the modifier because I tried to keep the internal logic untouched where possible (performance isn't affected since this is not on the hot path).
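
   For illustration, the guarding pattern being discussed can be condensed as follows. This is a simplified, hypothetical sketch, not the actual `JdbcBatchingOutputFormat`: since every access to `closed` happens while holding the same monitor, the monitor already provides both mutual exclusion and visibility, so `volatile` adds nothing.

```java
// Simplified, hypothetical sketch (not the actual JdbcBatchingOutputFormat):
// every access to `closed` happens while holding the same monitor, so the
// monitor already guarantees visibility and `volatile` is redundant.
class BatchingWriter {
    private boolean closed = false; // guarded by `this`
    private int batchCount = 0;     // guarded by `this`

    synchronized void scheduledFlush() { // called by the timer thread
        if (!closed) {
            flush();
        }
    }

    synchronized void flush() {
        batchCount = 0; // stand-in for executing the accumulated batch
    }

    synchronized void close() {
        if (!closed) {
            closed = true;
            flush(); // the monitor is reentrant, so this nested call is safe
        }
    }

    synchronized boolean isClosed() {
        return closed;
    }
}
```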

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389952855
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/JdbcBatchStatementExecutor.java
 ##########
 @@ -16,29 +16,27 @@
  * limitations under the License.
  */
 
-package org.apache.flink.api.java.io.jdbc.writer;
+package org.apache.flink.api.java.io.jdbc.executor;
 
-import org.apache.flink.api.java.tuple.Tuple2;
-import org.apache.flink.types.Row;
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
 
-import java.io.Serializable;
 import java.sql.Connection;
 import java.sql.SQLException;
+import java.util.function.Function;
 
 /**
- * JDBCWriter used to execute statements (e.g. INSERT, UPSERT, DELETE).
+ * Executes the given JDBC statement in batch for the accumulated records.
  */
-public interface JDBCWriter extends Serializable {
+@Internal
+public interface JdbcBatchStatementExecutor<T> {
 
 Review comment:
   After looking further at the code, I noticed the exception was wrapped in one place into `RuntimeException` instead of `IOException`, so I fixed the wrapping.

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385576739
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,298 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final ExecutorCreator<JdbcExec> statementRunnerCreator;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			ExecutorCreator<JdbcExec> statementExecutorCreator,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
+		this.statementRunnerCreator = statementExecutorCreator;
+		this.jdbcRecordExtractor = recordExtractor;
+	}
+
+	/**
+	 * Connects to the target database and initializes the prepared statement.
+	 *
+	 * @param taskNumber The number of the parallel instance.
+	 */
+	@Override
+	public void open(int taskNumber, int numTasks) throws IOException {
+		super.open(taskNumber, numTasks);
+		jdbcStatementExecutor = statementRunnerCreator.apply(getRuntimeContext());
+		try {
+			jdbcStatementExecutor.open(connection);
+		} catch (SQLException e) {
+			throw new IOException("unable to open JDBC writer", e);
+		}
+		if (executionOptions.getBatchIntervalMs() != 0 && executionOptions.getBatchSize() != 1) {
+			this.scheduler = Executors.newScheduledThreadPool(1, new ExecutorThreadFactory("jdbc-upsert-output-format"));
+			this.scheduledFuture = this.scheduler.scheduleWithFixedDelay(() -> {
+				synchronized (JdbcBatchingOutputFormat.this) {
+					if (!closed) {
+						try {
+							flush();
+						} catch (Exception e) {
+							flushException = e;
+						}
+					}
+				}
+			}, executionOptions.getBatchIntervalMs(), executionOptions.getBatchIntervalMs(), TimeUnit.MILLISECONDS);
+		}
+	}
+
+	private void checkFlushException() {
+		if (flushException != null) {
+			throw new RuntimeException("Writing records to JDBC failed.", flushException);
+		}
+	}
+
+	@Override
+	public final synchronized void writeRecord(In record) {
+		checkFlushException();
+
+		try {
+			doWriteRecord(record);
+			batchCount++;
+			if (batchCount >= executionOptions.getBatchSize()) {
+				flush();
+			}
+		} catch (Exception e) {
+			throw new RuntimeException("Writing records to JDBC failed.", e);
+		}
+	}
+
+	void doWriteRecord(In record) throws SQLException {
+		jdbcStatementExecutor.process(jdbcRecordExtractor.apply(record));
+	}
+
+	@Override
+	public synchronized void flush() throws IOException {
+		checkFlushException();
+
+		for (int i = 1; i <= executionOptions.getMaxRetries(); i++) {
+			try {
+				attemptFlush();
 
 Review comment:
   Yes, that's a good idea.
   Created a ticket for that: https://issues.apache.org/jira/browse/FLINK-16328
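
   The idea behind such a retry loop with backoff could be sketched roughly like this. This is a hypothetical illustration, not the actual FLINK-16328 implementation; `attemptFlush` and `maxRetries` mirror the names in the diff above, while the linear backoff itself is an assumption.

```java
import java.io.IOException;

// Hypothetical sketch: retry a flush a bounded number of times, backing
// off between attempts. NOT the actual FLINK-16328 implementation.
final class RetryFlush {
    static void flushWithRetry(Runnable attemptFlush, int maxRetries) throws IOException {
        for (int i = 1; i <= maxRetries; i++) {
            try {
                attemptFlush.run();
                return; // flush succeeded
            } catch (RuntimeException e) {
                if (i == maxRetries) {
                    throw new IOException("flush failed after " + maxRetries + " attempts", e);
                }
                try {
                    Thread.sleep(100L * i); // linear backoff before the next attempt
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    throw new IOException("interrupted while waiting to retry", ie);
                }
            }
        }
    }
}
```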

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383692265
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JDBCOutputFormat.java
 ##########
 @@ -37,25 +36,36 @@
  * @see Row
  * @see DriverManager
  */
-public class JDBCOutputFormat extends AbstractJDBCOutputFormat<Row> {
+/**
+ * @deprecated use {@link JdbcBatchingOutputFormat}
+ */
+@Deprecated
+public class JDBCOutputFormat extends AbstractJdbcOutputFormat<Row> {
 
 	private static final long serialVersionUID = 1L;
 
 	private static final Logger LOG = LoggerFactory.getLogger(JDBCOutputFormat.class);
 
-	private final String query;
-	private final int batchInterval;
-	private final int[] typesArray;
+	final JdbcInsertOptions insertOptions;
+	private final JdbcExecutionOptions batchOptions;
 
-	private PreparedStatement upload;
-	private int batchCount = 0;
+	private transient PreparedStatement upload;
+	private transient int batchCount = 0;
 
-	public JDBCOutputFormat(String username, String password, String drivername,
-			String dbURL, String query, int batchInterval, int[] typesArray) {
-		super(username, password, drivername, dbURL);
-		this.query = query;
-		this.batchInterval = batchInterval;
-		this.typesArray = typesArray;
+	/**
+	 * @deprecated use {@link #JDBCOutputFormat(JdbcConnectionProvider, JdbcInsertOptions, JdbcExecutionOptions)}}.
+	 */
+	@Deprecated
+	public JDBCOutputFormat(String username, String password, String drivername, String dbURL, String query, int batchInterval, int[] typesArray) {
+		this(new SimpleJdbcConnectionProvider(new JdbcConnectionOptions(dbURL, drivername, username, password)),
+				new JdbcInsertOptions(query, typesArray),
+				JdbcExecutionOptions.builder().withBatchSize(batchInterval).build());
+	}
+
+	public JDBCOutputFormat(JdbcConnectionProvider connectionProvider, JdbcInsertOptions insertOptions, JdbcExecutionOptions batchOptions) {
 
 Review comment:
   We should recommend that users use builders instead of public constructors. Make it `private`?
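
   The builder-over-constructor recommendation can be illustrated with a generic sketch (hypothetical names, not Flink's actual classes): keeping the constructor private and exposing only a builder lets optional settings gain defaults and evolve without breaking constructor signatures.

```java
// Generic sketch of the builder-over-public-constructor recommendation.
// Names are illustrative, not Flink's actual API.
final class ExecutionOptions {
    private final int batchSize;
    private final int maxRetries;

    private ExecutionOptions(Builder b) { // constructor stays private
        this.batchSize = b.batchSize;
        this.maxRetries = b.maxRetries;
    }

    static Builder builder() {
        return new Builder();
    }

    int getBatchSize() { return batchSize; }
    int getMaxRetries() { return maxRetries; }

    static final class Builder {
        private int batchSize = 5000; // defaults can change without breaking callers
        private int maxRetries = 3;

        Builder withBatchSize(int n) { this.batchSize = n; return this; }
        Builder withMaxRetries(int n) { this.maxRetries = n; return this; }
        ExecutionOptions build() { return new ExecutionOptions(this); }
    }
}
```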

----------------------------------------------------------------

[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383174903
 
 

 ##########
 File path: docs/dev/connectors/jdbc.md
 ##########
 @@ -0,0 +1,97 @@
+---
+title: "JDBC Connector"
+nav-title: JDBC
+nav-parent_id: connectors
+nav-pos: 9
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+* This will be replaced by the TOC
+{:toc}
+
+
+
+
+This connector provides a sink that writes data to a JDBC database.
+
+To use this it, add the following dependency to your project (along with your JDBC-driver):
 
 Review comment:
   Remove the "this".

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385486500
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -46,15 +47,15 @@
 		}
 	}
 
-	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	interface StatementExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
 
 Review comment:
   What about `StatementExecutorFactory`?
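
   The shape of the interface under discussion, whatever its final name, follows a common pattern that can be sketched generically (illustrative names, not Flink's actual API): extending both `Function` and `Serializable` lets a lambda assigned to the interface be serialized along with the operator that holds it.

```java
import java.io.Serializable;
import java.util.function.Function;

// Generic sketch of the serializable-factory pattern in the diff above:
// an interface extending both Function and Serializable makes lambda
// instances serializable "for free". Names are illustrative, not Flink's.
interface SerializableFactory<C, T> extends Function<C, T>, Serializable {
}

class FactoryDemo {
    // This lambda is Serializable because of the interface it targets.
    static final SerializableFactory<Integer, String> LABEL = i -> "executor-" + i;
}
```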

----------------------------------------------------------------

[GitHub] [flink] flinkbot commented on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot commented on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * cf15b1eae72eb65acbb1d3c145034b7005fac134 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/150879144) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383713343
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/KeyedBatchStatementExecutor.java
 ##########
 @@ -0,0 +1,79 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc.executor;
+
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
+
+import java.sql.Connection;
+import java.sql.PreparedStatement;
+import java.sql.SQLException;
+import java.util.HashSet;
+import java.util.Set;
+import java.util.function.Function;
+
+class KeyedBatchStatementExecutor<T, K> implements JdbcBatchStatementExecutor<T> {
+
+	private final String sql;
+	private final JdbcStatementBuilder<K> parameterSetter;
+	private final Function<T, K> keyExtractor;
+
+	private transient Set<K> batch = new HashSet<>();
+	private transient PreparedStatement st;
+
+	/**
+	 * Keep in mind object reuse: if it's on then key extractor may be required to return new object.
+	 */
+	KeyedBatchStatementExecutor(String sql, Function<T, K> keyExtractor, JdbcStatementBuilder<K> statementBuilder) {
+		this.parameterSetter = statementBuilder;
+		this.keyExtractor = keyExtractor;
+		this.sql = sql;
+	}
+
+	@Override
+	public void open(Connection connection) throws SQLException {
+		batch = new HashSet<>();
+		st = connection.prepareStatement(sql);
+	}
+
+	@Override
+	public void process(T record) {
+		batch.add(keyExtractor.apply(record));
+	}
+
+	@Override
+	public void executeBatch() throws SQLException {
+		if (!batch.isEmpty()) {
+			for (K entry : batch) {
+				parameterSetter.accept(st, entry);
 
 Review comment:
   I noticed that the upsert statement also uses this executor. However, it only buffers the key data; how can it update the value columns without buffering the values as well?
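
   One possible direction (an illustrative sketch, not code from this PR; all names are hypothetical): buffer the latest record per key instead of only the keys, so that at flush time an upsert has both the key and the value columns available.

   ```java
   import java.util.ArrayList;
   import java.util.LinkedHashMap;
   import java.util.List;
   import java.util.Map;
   import java.util.function.Function;

   /**
    * Illustrative sketch, not code from this PR: buffer the latest record per
    * key (instead of only the keys) so that an upsert can set both key and
    * value columns when the batch is flushed. All names are hypothetical.
    */
   class KeyedRecordBuffer<T, K> {
       private final Function<T, K> keyExtractor;
       // Latest record wins per key; LinkedHashMap keeps a stable order.
       private final Map<K, T> batch = new LinkedHashMap<>();

       KeyedRecordBuffer(Function<T, K> keyExtractor) {
           this.keyExtractor = keyExtractor;
       }

       void process(T record) {
           batch.put(keyExtractor.apply(record), record);
       }

       /** Returns the deduplicated records and clears the buffer. */
       List<T> drain() {
           List<T> out = new ArrayList<>(batch.values());
           batch.clear();
           return out;
       }
   }
   ```

   With object reuse enabled, the record (not just the key) would have to be copied before being stored, mirroring the caveat in the constructor javadoc above.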

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385172382
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcSink.java
 ##########
 @@ -0,0 +1,63 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.streaming.api.functions.sink.SinkFunction;
+
+/**
+ * Facade to create JDBC {@link SinkFunction sinks}.
+ */
+@PublicEvolving
+public class JdbcSink {
 
 Review comment:
   This PR only targets JDBC sinks.
   For sources I think a similar refactoring would be required.

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   * 84593f60a55ba420204cb263172be5e75463bd8b Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148445223) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072) 
   

----------------------------------------------------------------

[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389774439
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcDmlOptions.java
 ##########
 @@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.java.io.jdbc.dialect.JDBCDialect;
+import org.apache.flink.util.Preconditions;
+
+import javax.annotation.Nullable;
+
+import java.util.Optional;
+import java.util.stream.Stream;
+
+/**
+ * JDBC sink DML options.
+ */
+public class JdbcDmlOptions extends JdbcTypedQueryOptions {
+
+	private static final long serialVersionUID = 1L;
+
+	private final String[] fieldNames;
+	@Nullable
+	private final String[] keyFields;
+	private final String tableName;
+	private final JDBCDialect dialect;
+
+	public static JdbcDmlOptionsBuilder builder() {
+		return new JdbcDmlOptionsBuilder();
+	}
+
+	private JdbcDmlOptions(String tableName, JDBCDialect dialect, String[] fieldNames, int[] fieldTypes, String[] keyFields) {
+		super(fieldTypes);
+		this.tableName = Preconditions.checkNotNull(tableName, "table is empty");
+		this.dialect = Preconditions.checkNotNull(dialect, "dialect name is empty");
+		this.fieldNames = Preconditions.checkNotNull(fieldNames, "field names is empty");
+		this.keyFields = keyFields;
+	}
+
+	public String getTableName() {
+		return tableName;
+	}
+
+	public JDBCDialect getDialect() {
+		return dialect;
+	}
+
+	public String[] getFieldNames() {
+		return fieldNames;
+	}
+
+	public Optional<String[]> getKeyFields() {
+		return Optional.ofNullable(keyFields);
+	}
+
+	/**
+	 * JDBCUpsertOptionsBuilder.
 
 Review comment:
   Probably add something like "A builder for ...". I understand that this is somewhat redundant, but at least it is more meaningful :P

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 1b44ded44256c4061c857970a97c4b71f8a6b67d Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/152619337) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=6126) 
   

----------------------------------------------------------------

[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385143302
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,298 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final ExecutorCreator<JdbcExec> statementRunnerCreator;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			ExecutorCreator<JdbcExec> statementExecutorCreator,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
+		this.statementRunnerCreator = statementExecutorCreator;
+		this.jdbcRecordExtractor = recordExtractor;
+	}
+
+	/**
+	 * Connects to the target database and initializes the prepared statement.
+	 *
+	 * @param taskNumber The number of the parallel instance.
+	 */
+	@Override
+	public void open(int taskNumber, int numTasks) throws IOException {
+		super.open(taskNumber, numTasks);
+		jdbcStatementExecutor = statementRunnerCreator.apply(getRuntimeContext());
+		try {
+			jdbcStatementExecutor.open(connection);
+		} catch (SQLException e) {
+			throw new IOException("unable to open JDBC writer", e);
+		}
+		if (executionOptions.getBatchIntervalMs() != 0 && executionOptions.getBatchSize() != 1) {
+			this.scheduler = Executors.newScheduledThreadPool(1, new ExecutorThreadFactory("jdbc-upsert-output-format"));
+			this.scheduledFuture = this.scheduler.scheduleWithFixedDelay(() -> {
+				synchronized (JdbcBatchingOutputFormat.this) {
+					if (!closed) {
+						try {
+							flush();
+						} catch (Exception e) {
+							flushException = e;
+						}
+					}
+				}
+			}, executionOptions.getBatchIntervalMs(), executionOptions.getBatchIntervalMs(), TimeUnit.MILLISECONDS);
+		}
+	}
+
+	private void checkFlushException() {
+		if (flushException != null) {
+			throw new RuntimeException("Writing records to JDBC failed.", flushException);
+		}
+	}
+
+	@Override
+	public final synchronized void writeRecord(In record) {
+		checkFlushException();
+
+		try {
+			doWriteRecord(record);
+			batchCount++;
+			if (batchCount >= executionOptions.getBatchSize()) {
+				flush();
+			}
+		} catch (Exception e) {
+			throw new RuntimeException("Writing records to JDBC failed.", e);
+		}
+	}
+
+	void doWriteRecord(In record) throws SQLException {
+		jdbcStatementExecutor.process(jdbcRecordExtractor.apply(record));
+	}
+
+	@Override
+	public synchronized void flush() throws IOException {
+		checkFlushException();
+
+		for (int i = 1; i <= executionOptions.getMaxRetries(); i++) {
+			try {
+				attemptFlush();
 
 Review comment:
   I agree that this seems to be more of a bug that deserves its own PR. Would you like to open a JIRA for this and assign it to yourself?
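
   For context, the pattern under discussion can be sketched as a bounded retry loop that stops on the first success and rethrows only after the last attempt fails. This is an illustrative stand-alone helper, not the PR's code, and the backoff values are made up:

   ```java
   import java.util.concurrent.Callable;

   /**
    * Illustrative sketch, not the PR's code: a bounded-retry helper of the
    * kind flush() implements inline. It returns on the first successful
    * attempt and rethrows the last exception after maxRetries failures.
    * The linear backoff between attempts is an arbitrary choice here.
    */
   final class BoundedRetry {
       private BoundedRetry() {}

       static <T> T run(Callable<T> attempt, int maxRetries) throws Exception {
           Exception last = null;
           for (int i = 1; i <= maxRetries; i++) {
               try {
                   return attempt.call(); // success: stop retrying
               } catch (Exception e) {
                   last = e;
                   if (i < maxRetries) {
                       Thread.sleep(100L * i); // linear backoff before the next try
                   }
               }
           }
           throw last;
       }
   }
   ```

   The key property a retry loop like this must have, and what the review is probing, is that a successful attempt exits the loop immediately rather than being re-executed.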

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "06ba58a8738f28535d4858b5a5efc863cda620f9",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/148370417",
       "triggerID" : "06ba58a8738f28535d4858b5a5efc863cda620f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "06ba58a8738f28535d4858b5a5efc863cda620f9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055",
       "triggerID" : "06ba58a8738f28535d4858b5a5efc863cda620f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "40cc737da998c1e3a696d3c2714e2c76c5f34b8d",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/148375156",
       "triggerID" : "40cc737da998c1e3a696d3c2714e2c76c5f34b8d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "40cc737da998c1e3a696d3c2714e2c76c5f34b8d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058",
       "triggerID" : "40cc737da998c1e3a696d3c2714e2c76c5f34b8d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "457f7fdcabace52c48926f116b3878a1cec0f550",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061",
       "triggerID" : "457f7fdcabace52c48926f116b3878a1cec0f550",
       "triggerType" : "PUSH"
     }, {
       "hash" : "457f7fdcabace52c48926f116b3878a1cec0f550",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/148380612",
       "triggerID" : "457f7fdcabace52c48926f116b3878a1cec0f550",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84593f60a55ba420204cb263172be5e75463bd8b",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/148445223",
       "triggerID" : "84593f60a55ba420204cb263172be5e75463bd8b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84593f60a55ba420204cb263172be5e75463bd8b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072",
       "triggerID" : "84593f60a55ba420204cb263172be5e75463bd8b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/148531545",
       "triggerID" : "e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5088",
       "triggerID" : "e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b465793e72972b78ba0bb87921ac001fec959bb3",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/149292262",
       "triggerID" : "b465793e72972b78ba0bb87921ac001fec959bb3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b465793e72972b78ba0bb87921ac001fec959bb3",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5250",
       "triggerID" : "b465793e72972b78ba0bb87921ac001fec959bb3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "cf15b1eae72eb65acbb1d3c145034b7005fac134",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/150879144",
       "triggerID" : "cf15b1eae72eb65acbb1d3c145034b7005fac134",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7eb48481d1ba9fb8a3ec4a6a165ba188be1b300d",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/151226586",
       "triggerID" : "7eb48481d1ba9fb8a3ec4a6a165ba188be1b300d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c48ad371b961435b465aa1fd879d9d08c7d27f81",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/151231245",
       "triggerID" : "c48ad371b961435b465aa1fd879d9d08c7d27f81",
       "triggerType" : "PUSH"
     }, {
       "hash" : "246a3cb4a62b59256ced746abfecfa18d5064745",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5948",
       "triggerID" : "246a3cb4a62b59256ced746abfecfa18d5064745",
       "triggerType" : "PUSH"
     }, {
       "hash" : "246a3cb4a62b59256ced746abfecfa18d5064745",
       "status" : "SUCCESS",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/151894618",
       "triggerID" : "246a3cb4a62b59256ced746abfecfa18d5064745",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1b44ded44256c4061c857970a97c4b71f8a6b67d",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "1b44ded44256c4061c857970a97c4b71f8a6b67d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 246a3cb4a62b59256ced746abfecfa18d5064745 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/151894618) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5948) 
   * 1b44ded44256c4061c857970a97c4b71f8a6b67d UNKNOWN
   

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 246a3cb4a62b59256ced746abfecfa18d5064745 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/151894618) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5948) 
   * 1b44ded44256c4061c857970a97c4b71f8a6b67d Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/152619337) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=6126) 
   

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 UNKNOWN
   

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383732719
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JdbcE2eTest.java
 ##########
 @@ -0,0 +1,103 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.restartstrategy.RestartStrategies;
+import org.apache.flink.api.java.io.jdbc.JdbcTestFixture.TestEntry;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+
+import org.junit.Ignore;
+import org.junit.Test;
+
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.sql.Types;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.List;
+
+import static org.apache.flink.api.java.io.jdbc.JdbcTestFixture.INPUT_TABLE;
+import static org.apache.flink.api.java.io.jdbc.JdbcTestFixture.INSERT_TEMPLATE;
+import static org.apache.flink.api.java.io.jdbc.JdbcTestFixture.TEST_DATA;
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Smoke tests for the {@link JdbcSink} and the underlying classes.
+ */
+public class JdbcE2eTest extends JDBCTestBase {
+
+	@Test
+	@Ignore
+	public void testInsert() throws Exception {
+		StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
+		env.setRestartStrategy(new RestartStrategies.NoRestartStrategyConfiguration());
+		env.setParallelism(1);
+		env
+				.fromElements(TEST_DATA)
+				.addSink(JdbcSink.sink(
 
 Review comment:
   We are using one additional tab for continuation indent, instead of 2.

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * c48ad371b961435b465aa1fd879d9d08c7d27f81 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/151231245) 
   * 246a3cb4a62b59256ced746abfecfa18d5064745 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/151894618) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5948) 
   

----------------------------------------------------------------

[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389771923
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/GenericJdbcSinkFunction.java
 ##########
 @@ -19,18 +19,18 @@
 package org.apache.flink.api.java.io.jdbc;
 
 import org.apache.flink.api.common.functions.RuntimeContext;
-import org.apache.flink.api.java.tuple.Tuple2;
 import org.apache.flink.configuration.Configuration;
 import org.apache.flink.runtime.state.FunctionInitializationContext;
 import org.apache.flink.runtime.state.FunctionSnapshotContext;
 import org.apache.flink.streaming.api.checkpoint.CheckpointedFunction;
 import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
-import org.apache.flink.types.Row;
 
-class JDBCUpsertSinkFunction extends RichSinkFunction<Tuple2<Boolean, Row>> implements CheckpointedFunction {
-	private final JDBCUpsertOutputFormat outputFormat;
+import java.io.IOException;
 
-	JDBCUpsertSinkFunction(JDBCUpsertOutputFormat outputFormat) {
+class GenericJdbcSinkFunction<T> extends RichSinkFunction<T> implements CheckpointedFunction {
+	private final AbstractJdbcOutputFormat<T> outputFormat;
+
+	GenericJdbcSinkFunction(AbstractJdbcOutputFormat<T> outputFormat) {
 		this.outputFormat = outputFormat;
 
 Review comment:
   Add a null check?
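   The suggestion above could look like the following minimal sketch. Names here are illustrative, not the PR's actual code, and `Objects.requireNonNull` stands in for Flink's `Preconditions.checkNotNull`:

   ```java
   import java.util.Objects;

   // Hypothetical, trimmed-down sketch of the constructor with the suggested
   // null check: fail fast at construction time instead of deferring a
   // NullPointerException to open() or invoke().
   class GenericJdbcSinkFunctionSketch<T> {
   	private final Object outputFormat; // stands in for AbstractJdbcOutputFormat<T>

   	GenericJdbcSinkFunctionSketch(Object outputFormat) {
   		this.outputFormat = Objects.requireNonNull(outputFormat, "outputFormat must not be null");
   	}

   	Object getOutputFormat() {
   		return outputFormat;
   	}
   }
   ```

   Failing in the constructor surfaces a misconfigured sink at job-graph construction rather than at runtime on the task managers.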

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385216316
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcExecutionOptions.java
 ##########
 @@ -0,0 +1,92 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.util.Preconditions;
+
+import java.io.Serializable;
+
+/**
+ * JDBC sink batch options.
+ */
+@PublicEvolving
+public class JdbcExecutionOptions implements Serializable {
 
 Review comment:
   `JdbcWritingOptions` doesn't seem to match `delete` queries.
   And `JdbcSinkOptions` sounds too generic to me. I'd expect something like JDBC URL inside.
   
   As for `executeQuery`, I don't see why `JdbcExecutionOptions` can't be used for it (batch could mean `limit` then).

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385156566
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcConnectionOptions.java
 ##########
 @@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.util.Preconditions;
+
+import java.io.Serializable;
+
+/**
+ * JDBC connection options.
+ */
+@PublicEvolving
+public class JdbcConnectionOptions implements Serializable {
+
+	private static final long serialVersionUID = 1L;
+
+	protected final String url;
+	protected final String driverName;
+	protected final String username;
+	protected final String password;
+
+	public JdbcConnectionOptions(String url, String driverName, String username, String password) {
+		this.url = Preconditions.checkNotNull(url, "jdbc url is empty");
+		this.driverName = Preconditions.checkNotNull(driverName, "driver name is empty");
+		this.username = username;
+		this.password = password;
+	}
+
+	public String getDbURL() {
+		return url;
+	}
+
+	public String getDriverName() {
+		return driverName;
+	}
+
+	public String getUsername() {
+		return username;
+	}
+
+	public String getPassword() {
+		return password;
+	}
+
 
 Review comment:
   This name clashes with `JDBCOptions#builder`, which is part of the current public API.
   I didn't want similar methods to be named differently, so I decided to just create the builder with `new`.
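   A sketch of the design choice being described, i.e. instantiating the builder directly with `new` rather than through a static `builder()` factory. The class and method names below are illustrative, not the actual PR API:

   ```java
   // Illustrative only: a builder that callers instantiate directly with `new`,
   // sidestepping a static builder() factory whose name would clash with an
   // existing public method (here, JDBCOptions#builder).
   class ConnectionOptionsSketch {
   	final String url;
   	final String driverName;

   	private ConnectionOptionsSketch(Builder builder) {
   		this.url = builder.url;
   		this.driverName = builder.driverName;
   	}

   	static class Builder {
   		private String url;
   		private String driverName;

   		Builder withUrl(String url) {
   			this.url = url;
   			return this;
   		}

   		Builder withDriverName(String driverName) {
   			this.driverName = driverName;
   			return this;
   		}

   		ConnectionOptionsSketch build() {
   			return new ConnectionOptionsSketch(this);
   		}
   	}
   }
   ```

   Callers then write `new ConnectionOptionsSketch.Builder().withUrl(...).build()`, which reads almost identically to a static-factory builder while keeping the `builder` name unused.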

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   * 84593f60a55ba420204cb263172be5e75463bd8b Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148445223) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072) 
   * e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148531545) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5088) 
   * b465793e72972b78ba0bb87921ac001fec959bb3 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/149292262) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5250) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383179831
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,298 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final ExecutorCreator<JdbcExec> statementRunnerCreator;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			ExecutorCreator<JdbcExec> statementExecutorCreator,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
+		this.statementRunnerCreator = statementExecutorCreator;
+		this.jdbcRecordExtractor = recordExtractor;
+	}
+
+	/**
+	 * Connects to the target database and initializes the prepared statement.
+	 *
+	 * @param taskNumber The number of the parallel instance.
+	 */
+	@Override
+	public void open(int taskNumber, int numTasks) throws IOException {
+		super.open(taskNumber, numTasks);
+		jdbcStatementExecutor = statementRunnerCreator.apply(getRuntimeContext());
+		try {
+			jdbcStatementExecutor.open(connection);
+		} catch (SQLException e) {
+			throw new IOException("unable to open JDBC writer", e);
+		}
+		if (executionOptions.getBatchIntervalMs() != 0 && executionOptions.getBatchSize() != 1) {
+			this.scheduler = Executors.newScheduledThreadPool(1, new ExecutorThreadFactory("jdbc-upsert-output-format"));
+			this.scheduledFuture = this.scheduler.scheduleWithFixedDelay(() -> {
+				synchronized (JdbcBatchingOutputFormat.this) {
+					if (!closed) {
+						try {
+							flush();
+						} catch (Exception e) {
+							flushException = e;
+						}
+					}
+				}
+			}, executionOptions.getBatchIntervalMs(), executionOptions.getBatchIntervalMs(), TimeUnit.MILLISECONDS);
+		}
+	}
+
+	private void checkFlushException() {
+		if (flushException != null) {
+			throw new RuntimeException("Writing records to JDBC failed.", flushException);
+		}
+	}
+
+	@Override
+	public final synchronized void writeRecord(In record) {
+		checkFlushException();
+
+		try {
+			doWriteRecord(record);
+			batchCount++;
+			if (batchCount >= executionOptions.getBatchSize()) {
+				flush();
+			}
+		} catch (Exception e) {
+			throw new RuntimeException("Writing records to JDBC failed.", e);
+		}
+	}
+
+	void doWriteRecord(In record) throws SQLException {
+		jdbcStatementExecutor.process(jdbcRecordExtractor.apply(record));
+	}
+
+	@Override
+	public synchronized void flush() throws IOException {
+		checkFlushException();
+
+		for (int i = 1; i <= executionOptions.getMaxRetries(); i++) {
+			try {
+				attemptFlush();
 
 Review comment:
   Shouldn't we restart the timer here? It seems we may be in a situation where we flushed this batch because we reached the maximum size, so for the new batch we have to set a new timer (renew the `BatchIntervalMs`), right?
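
One way to address this (a hedged sketch under the reviewer's suggestion — class and method names here are hypothetical, not the PR's actual code) is to schedule a one-shot task and re-arm it after every flush, so that a size-triggered flush also resets the interval instead of leaving the fixed-delay timer running:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

// Sketch: re-arm a one-shot timer after each flush, so BatchIntervalMs
// is always measured from the most recent flush (interval- or size-triggered).
public class FlushTimerSketch {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    private final long intervalMs;
    private ScheduledFuture<?> scheduledFuture;
    private int flushes = 0;

    FlushTimerSketch(long intervalMs) {
        this.intervalMs = intervalMs;
        armTimer();
    }

    private synchronized void armTimer() {
        scheduledFuture = scheduler.schedule(this::flush, intervalMs, TimeUnit.MILLISECONDS);
    }

    synchronized void flush() {
        flushes++; // stand-in for attemptFlush() / executing the JDBC batch
        if (scheduledFuture != null) {
            scheduledFuture.cancel(false); // drop the pending interval flush, if any
        }
        armTimer(); // next interval starts from this flush
    }

    synchronized int getFlushCount() {
        return flushes;
    }

    void shutdown() {
        scheduler.shutdownNow();
    }

    public static void main(String[] args) {
        FlushTimerSketch sink = new FlushTimerSketch(60_000);
        sink.flush(); // size-triggered flush: timer is renewed, not left running
        sink.flush();
        sink.shutdown();
        System.out.println(sink.getFlushCount());
    }
}
```

The trade-off versus `scheduleWithFixedDelay` is an extra cancel/schedule per flush, in exchange for never firing an interval flush immediately after a size-triggered one.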

----------------------------------------------------------------

[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389780755
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/JdbcBatchStatementExecutor.java
 ##########
 @@ -16,29 +16,27 @@
  * limitations under the License.
  */
 
-package org.apache.flink.api.java.io.jdbc.writer;
+package org.apache.flink.api.java.io.jdbc.executor;
 
-import org.apache.flink.api.java.tuple.Tuple2;
-import org.apache.flink.types.Row;
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
 
-import java.io.Serializable;
 import java.sql.Connection;
 import java.sql.SQLException;
+import java.util.function.Function;
 
 /**
- * JDBCWriter used to execute statements (e.g. INSERT, UPSERT, DELETE).
+ * Executes the given JDBC statement in batch for the accumulated records.
  */
-public interface JDBCWriter extends Serializable {
+@Internal
+public interface JdbcBatchStatementExecutor<T> {
 
 Review comment:
   Here all the methods are expected to only throw `SQLException` and then, in many cases, we wrap the exceptions in other types. Does it make sense to simply have these methods throw an `Exception`?
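
The suggestion amounts to widening the `throws` clause (a hedged sketch — the interface body below is illustrative, not the merged code), so implementations that fail with something other than a `SQLException` don't have to wrap it:

```java
import java.sql.Connection;
import java.util.ArrayList;
import java.util.List;

// Sketch: declare `throws Exception` so implementations can propagate any
// checked failure directly instead of wrapping it in SQLException.
interface BatchStatementExecutorSketch<T> {
    void open(Connection connection) throws Exception;
    void process(T record) throws Exception;
    void executeBatch() throws Exception;
    void close() throws Exception;
}

public class ExecutorSketchDemo {
    public static void main(String[] args) throws Exception {
        List<String> executed = new ArrayList<>();
        // In-memory stand-in for a JDBC-backed executor.
        BatchStatementExecutorSketch<String> exec = new BatchStatementExecutorSketch<String>() {
            private final List<String> buffer = new ArrayList<>();
            public void open(Connection c) { }
            public void process(String record) { buffer.add(record); }
            public void executeBatch() { executed.addAll(buffer); buffer.clear(); }
            public void close() { }
        };
        exec.open(null);
        exec.process("a");
        exec.process("b");
        exec.executeBatch();
        exec.close();
        System.out.println(executed.size());
    }
}
```

The cost is that callers lose the compile-time guarantee that only SQL failures can escape, which is why wrapping at the call site is the common alternative.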

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   <!--
   Meta data
   Hash:06ba58a8738f28535d4858b5a5efc863cda620f9 Status:CANCELED URL:https://travis-ci.com/flink-ci/flink/builds/148370417 TriggerType:PUSH TriggerID:06ba58a8738f28535d4858b5a5efc863cda620f9
   Hash:06ba58a8738f28535d4858b5a5efc863cda620f9 Status:FAILURE URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055 TriggerType:PUSH TriggerID:06ba58a8738f28535d4858b5a5efc863cda620f9
   Hash:40cc737da998c1e3a696d3c2714e2c76c5f34b8d Status:CANCELED URL:https://travis-ci.com/flink-ci/flink/builds/148375156 TriggerType:PUSH TriggerID:40cc737da998c1e3a696d3c2714e2c76c5f34b8d
   Hash:40cc737da998c1e3a696d3c2714e2c76c5f34b8d Status:FAILURE URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058 TriggerType:PUSH TriggerID:40cc737da998c1e3a696d3c2714e2c76c5f34b8d
   Hash:457f7fdcabace52c48926f116b3878a1cec0f550 Status:FAILURE URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061 TriggerType:PUSH TriggerID:457f7fdcabace52c48926f116b3878a1cec0f550
   Hash:457f7fdcabace52c48926f116b3878a1cec0f550 Status:FAILURE URL:https://travis-ci.com/flink-ci/flink/builds/148380612 TriggerType:PUSH TriggerID:457f7fdcabace52c48926f116b3878a1cec0f550
   Hash:84593f60a55ba420204cb263172be5e75463bd8b Status:UNKNOWN URL:TBD TriggerType:PUSH TriggerID:84593f60a55ba420204cb263172be5e75463bd8b
   -->
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   * 84593f60a55ba420204cb263172be5e75463bd8b UNKNOWN
   

----------------------------------------------------------------

[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389773503
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,316 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.JdbcExecutionOptions.JdbcExecutionOptionsBuilder;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.api.java.io.jdbc.JDBCUtils.setRecordToStatement;
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface StatementExecutorFactory<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final StatementExecutorFactory<JdbcExec> statementExecutorFactory;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
 
 Review comment:
   What about making it private and adding a package-private or protected getter? This is a question of personal taste though, so feel free to ignore if you disagree :)
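
A minimal sketch of the suggested encapsulation (names here are illustrative, not the PR's code): keep the extractor field private and let subclasses read it through a package-private getter.

```java
import java.util.function.Function;

// Sketch: private field plus a package-private accessor, instead of a
// package-private field read directly by subclasses.
public class OutputFormatSketch<In, JdbcIn> {
    private final Function<In, JdbcIn> jdbcRecordExtractor;

    OutputFormatSketch(Function<In, JdbcIn> jdbcRecordExtractor) {
        this.jdbcRecordExtractor = jdbcRecordExtractor;
    }

    // Subclasses (e.g. a Table API variant) go through the getter,
    // so the field's representation can change without touching them.
    Function<In, JdbcIn> getRecordExtractor() {
        return jdbcRecordExtractor;
    }

    public static void main(String[] args) {
        OutputFormatSketch<String, Integer> format =
                new OutputFormatSketch<>(String::length);
        System.out.println(format.getRecordExtractor().apply("jdbc"));
    }
}
```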

----------------------------------------------------------------

[GitHub] [flink] aljoscha commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
aljoscha commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r382586616
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcStatementBuilder.java
 ##########
 @@ -0,0 +1,46 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.function.BiConsumerWithException;
+
+import java.io.Serializable;
+import java.sql.PreparedStatement;
+import java.sql.SQLException;
+
+import static org.apache.flink.api.java.io.jdbc.JDBCUtils.setRecordToStatement;
+
+/**
+ * Sets {@link PreparedStatement} parameters to use in JDBC Sink based on a specific type of StreamRecord.
+ * @param <T> type of payload in {@link org.apache.flink.streaming.runtime.streamrecord.StreamRecord StreamRecord}
+ * @see JdbcBatchStatementExecutor
+ */
+@PublicEvolving
+public interface JdbcStatementBuilder<T> extends BiConsumerWithException<PreparedStatement, T, SQLException>, Serializable {
+
+	/**
+	 * Creates a {@link JdbcStatementBuilder} for {@link Row} using the provided SQL types array.
+	 * Uses {@link org.apache.flink.api.java.io.jdbc.JDBCUtils#setRecordToStatement}
+	 */
+	static JdbcStatementBuilder<Row> forRow(int[] types) {
 
 Review comment:
   I think in the long run we don't want to have `Row` in non-Table packages. We therefore shouldn't further expose it in our public APIs.
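
The lambda-based `JdbcStatementBuilder` already makes this possible: users can bind their own POJO instead of a Table-API `Row`. A hedged sketch (the `Book` class and the simplified interface below are hypothetical, for illustration only):

```java
import java.io.Serializable;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Simplified stand-in for JdbcStatementBuilder: a serializable consumer
// that binds one record to a PreparedStatement.
interface StatementBuilderSketch<T> extends Serializable {
    void accept(PreparedStatement statement, T value) throws SQLException;
}

// Hypothetical user type; no Row involved.
class Book {
    final long id;
    final String title;
    Book(long id, String title) { this.id = id; this.title = title; }
}

public class BuilderDemo {
    static final StatementBuilderSketch<Book> BOOK_BUILDER = (ps, book) -> {
        ps.setLong(1, book.id);      // positional parameters of e.g.
        ps.setString(2, book.title); // "INSERT INTO books (id, title) VALUES (?, ?)"
    };

    public static void main(String[] args) {
        // No database here; just demonstrate that the builder is a
        // serializable lambda suitable for shipping to task managers.
        System.out.println(BOOK_BUILDER instanceof Serializable);
    }
}
```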

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   <!--
   Meta data
   Hash:06ba58a8738f28535d4858b5a5efc863cda620f9 Status:CANCELED URL:https://travis-ci.com/flink-ci/flink/builds/148370417 TriggerType:PUSH TriggerID:06ba58a8738f28535d4858b5a5efc863cda620f9
   Hash:06ba58a8738f28535d4858b5a5efc863cda620f9 Status:FAILURE URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055 TriggerType:PUSH TriggerID:06ba58a8738f28535d4858b5a5efc863cda620f9
   Hash:40cc737da998c1e3a696d3c2714e2c76c5f34b8d Status:CANCELED URL:https://travis-ci.com/flink-ci/flink/builds/148375156 TriggerType:PUSH TriggerID:40cc737da998c1e3a696d3c2714e2c76c5f34b8d
   Hash:40cc737da998c1e3a696d3c2714e2c76c5f34b8d Status:FAILURE URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058 TriggerType:PUSH TriggerID:40cc737da998c1e3a696d3c2714e2c76c5f34b8d
   Hash:457f7fdcabace52c48926f116b3878a1cec0f550 Status:FAILURE URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061 TriggerType:PUSH TriggerID:457f7fdcabace52c48926f116b3878a1cec0f550
   Hash:457f7fdcabace52c48926f116b3878a1cec0f550 Status:FAILURE URL:https://travis-ci.com/flink-ci/flink/builds/148380612 TriggerType:PUSH TriggerID:457f7fdcabace52c48926f116b3878a1cec0f550
   Hash:84593f60a55ba420204cb263172be5e75463bd8b Status:SUCCESS URL:https://travis-ci.com/flink-ci/flink/builds/148445223 TriggerType:PUSH TriggerID:84593f60a55ba420204cb263172be5e75463bd8b
   Hash:84593f60a55ba420204cb263172be5e75463bd8b Status:FAILURE URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072 TriggerType:PUSH TriggerID:84593f60a55ba420204cb263172be5e75463bd8b
   Hash:e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 Status:SUCCESS URL:https://travis-ci.com/flink-ci/flink/builds/148531545 TriggerType:PUSH TriggerID:e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127
   Hash:e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 Status:SUCCESS URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5088 TriggerType:PUSH TriggerID:e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127
   -->
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   * 84593f60a55ba420204cb263172be5e75463bd8b Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148445223) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072) 
   * e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148531545) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5088) 
   

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "06ba58a8738f28535d4858b5a5efc863cda620f9",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/148370417",
       "triggerID" : "06ba58a8738f28535d4858b5a5efc863cda620f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "06ba58a8738f28535d4858b5a5efc863cda620f9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055",
       "triggerID" : "06ba58a8738f28535d4858b5a5efc863cda620f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "40cc737da998c1e3a696d3c2714e2c76c5f34b8d",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/148375156",
       "triggerID" : "40cc737da998c1e3a696d3c2714e2c76c5f34b8d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "40cc737da998c1e3a696d3c2714e2c76c5f34b8d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058",
       "triggerID" : "40cc737da998c1e3a696d3c2714e2c76c5f34b8d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "457f7fdcabace52c48926f116b3878a1cec0f550",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061",
       "triggerID" : "457f7fdcabace52c48926f116b3878a1cec0f550",
       "triggerType" : "PUSH"
     }, {
       "hash" : "457f7fdcabace52c48926f116b3878a1cec0f550",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/148380612",
       "triggerID" : "457f7fdcabace52c48926f116b3878a1cec0f550",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84593f60a55ba420204cb263172be5e75463bd8b",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/148445223",
       "triggerID" : "84593f60a55ba420204cb263172be5e75463bd8b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84593f60a55ba420204cb263172be5e75463bd8b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072",
       "triggerID" : "84593f60a55ba420204cb263172be5e75463bd8b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127",
       "status" : "DELETED",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/148531545",
       "triggerID" : "e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5088",
       "triggerID" : "e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b465793e72972b78ba0bb87921ac001fec959bb3",
       "status" : "SUCCESS",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/149292262",
       "triggerID" : "b465793e72972b78ba0bb87921ac001fec959bb3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b465793e72972b78ba0bb87921ac001fec959bb3",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5250",
       "triggerID" : "b465793e72972b78ba0bb87921ac001fec959bb3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "cf15b1eae72eb65acbb1d3c145034b7005fac134",
       "status" : "PENDING",
       "url" : "https://travis-ci.com/flink-ci/flink/builds/150879144",
       "triggerID" : "cf15b1eae72eb65acbb1d3c145034b7005fac134",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * b465793e72972b78ba0bb87921ac001fec959bb3 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/149292262) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5250) 
   * cf15b1eae72eb65acbb1d3c145034b7005fac134 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/150879144) 
   

----------------------------------------------------------------

[GitHub] [flink] aljoscha commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
aljoscha commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r380582271
 
 

 ##########
 File path: docs/redirects/jdbc.md
 ##########
 @@ -0,0 +1,24 @@
+---
 
 Review comment:
   Why are these redirects added?

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   <!--
   Meta data
   Hash:06ba58a8738f28535d4858b5a5efc863cda620f9 Status:CANCELED URL:https://travis-ci.com/flink-ci/flink/builds/148370417 TriggerType:PUSH TriggerID:06ba58a8738f28535d4858b5a5efc863cda620f9
   Hash:06ba58a8738f28535d4858b5a5efc863cda620f9 Status:FAILURE URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055 TriggerType:PUSH TriggerID:06ba58a8738f28535d4858b5a5efc863cda620f9
   Hash:40cc737da998c1e3a696d3c2714e2c76c5f34b8d Status:CANCELED URL:https://travis-ci.com/flink-ci/flink/builds/148375156 TriggerType:PUSH TriggerID:40cc737da998c1e3a696d3c2714e2c76c5f34b8d
   Hash:40cc737da998c1e3a696d3c2714e2c76c5f34b8d Status:FAILURE URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058 TriggerType:PUSH TriggerID:40cc737da998c1e3a696d3c2714e2c76c5f34b8d
   Hash:457f7fdcabace52c48926f116b3878a1cec0f550 Status:FAILURE URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061 TriggerType:PUSH TriggerID:457f7fdcabace52c48926f116b3878a1cec0f550
   Hash:457f7fdcabace52c48926f116b3878a1cec0f550 Status:FAILURE URL:https://travis-ci.com/flink-ci/flink/builds/148380612 TriggerType:PUSH TriggerID:457f7fdcabace52c48926f116b3878a1cec0f550
   Hash:84593f60a55ba420204cb263172be5e75463bd8b Status:SUCCESS URL:https://travis-ci.com/flink-ci/flink/builds/148445223 TriggerType:PUSH TriggerID:84593f60a55ba420204cb263172be5e75463bd8b
   Hash:84593f60a55ba420204cb263172be5e75463bd8b Status:FAILURE URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072 TriggerType:PUSH TriggerID:84593f60a55ba420204cb263172be5e75463bd8b
   Hash:e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 Status:SUCCESS URL:https://travis-ci.com/flink-ci/flink/builds/148531545 TriggerType:PUSH TriggerID:e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127
   Hash:e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 Status:SUCCESS URL:https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5088 TriggerType:PUSH TriggerID:e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127
   Hash:b465793e72972b78ba0bb87921ac001fec959bb3 Status:UNKNOWN URL:TBD TriggerType:PUSH TriggerID:b465793e72972b78ba0bb87921ac001fec959bb3
   -->
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   * 84593f60a55ba420204cb263172be5e75463bd8b Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148445223) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072) 
   * e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148531545) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5088) 
   * b465793e72972b78ba0bb87921ac001fec959bb3 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383726306
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcExecutionOptions.java
 ##########
 @@ -0,0 +1,92 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.util.Preconditions;
+
+import java.io.Serializable;
+
+/**
+ * JDBC sink batch options.
+ */
+@PublicEvolving
+public class JdbcExecutionOptions implements Serializable {
+	static final int DEFAULT_MAX_RETRY_TIMES = 3;
+	private static final int DEFAULT_INTERVAL_MILLIS = 0;
+	private static final int DEFAULT_SIZE = 5000;
+
+	private final long batchIntervalMs;
+	private final int batchSize;
+	private final int maxRetries;
+
+	private JdbcExecutionOptions(long batchIntervalMs, int batchSize, int maxRetries) {
+		Preconditions.checkArgument(maxRetries >= 1);
+		this.batchIntervalMs = batchIntervalMs;
+		this.batchSize = batchSize;
+		this.maxRetries = maxRetries;
+	}
+
+	public long getBatchIntervalMs() {
+		return batchIntervalMs;
+	}
+
+	public int getBatchSize() {
+		return batchSize;
+	}
+
+	public int getMaxRetries() {
+		return maxRetries;
+	}
+
+	public static JdbcBatchOptionsBuilder builder() {
+		return new JdbcBatchOptionsBuilder();
+	}
+
+	public static JdbcExecutionOptions defaults() {
+		return builder().build();
+	}
+
+	/**
+	 * JDBCBatchOptionsBuilder.
+	 */
+	public static final class JdbcBatchOptionsBuilder {
 
 Review comment:
   `JdbcExecutionOptionsBuilder`?
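   As context for this discussion, here is a small standalone sketch (not Flink code; all names in it are illustrative) of how the knobs defined in the hunk above — `batchSize`, `batchIntervalMs`, and `maxRetries` — typically drive a batching sink: records are buffered, and a flush happens when the buffer reaches `batchSize` or when `batchIntervalMs` has elapsed since the last flush.

   ```java
   import java.util.ArrayList;
   import java.util.List;

   /**
    * Standalone illustration (not Flink code) of the batching semantics behind
    * the options class above: records are buffered, and a flush is triggered
    * when the buffer reaches batchSize or when batchIntervalMs has elapsed
    * since the last flush. maxRetries would bound how often a failed
    * executeBatch() is retried (omitted here for brevity).
    */
   public class BatchingSketch {
       private final int batchSize;
       private final long batchIntervalMs;
       private final List<String> buffer = new ArrayList<>();
       private long lastFlushTime;
       private int flushCount;

       public BatchingSketch(int batchSize, long batchIntervalMs, long now) {
           this.batchSize = batchSize;
           this.batchIntervalMs = batchIntervalMs;
           this.lastFlushTime = now;
       }

       /** Buffer a record; flush if either threshold is crossed. The clock is injected for testability. */
       public void add(String record, long now) {
           buffer.add(record);
           boolean sizeReached = buffer.size() >= batchSize;
           boolean intervalElapsed = batchIntervalMs > 0 && now - lastFlushTime >= batchIntervalMs;
           if (sizeReached || intervalElapsed) {
               flush(now);
           }
       }

       private void flush(long now) {
           // A real sink would call PreparedStatement#executeBatch() here,
           // retrying up to maxRetries times on failure.
           buffer.clear();
           lastFlushTime = now;
           flushCount++;
       }

       public int getFlushCount() {
           return flushCount;
       }

       public static void main(String[] args) {
           BatchingSketch sink = new BatchingSketch(3, 100, 0);
           sink.add("a", 10);
           sink.add("b", 20);
           sink.add("c", 30);   // buffer reaches batchSize -> flush #1
           sink.add("d", 150);  // 150 - 30 >= 100 ms -> interval-based flush #2
           System.out.println("flushes=" + sink.getFlushCount());
       }
   }
   ```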

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r380642339
 
 

 ##########
 File path: docs/redirects/jdbc.md
 ##########
 @@ -0,0 +1,24 @@
+---
 
 Review comment:
   Just for uniformity with other pages.

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 7eb48481d1ba9fb8a3ec4a6a165ba188be1b300d Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/151226586) 
   * c48ad371b961435b465aa1fd879d9d08c7d27f81 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/151231245) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385155172
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcConnectionOptions.java
 ##########
 @@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.util.Preconditions;
+
+import java.io.Serializable;
+
+/**
+ * JDBC connection options.
+ */
+@PublicEvolving
+public class JdbcConnectionOptions implements Serializable {
+
+	private static final long serialVersionUID = 1L;
+
+	protected final String url;
+	protected final String driverName;
+	protected final String username;
+	protected final String password;
+
+	public JdbcConnectionOptions(String url, String driverName, String username, String password) {
+		this.url = Preconditions.checkNotNull(url, "jdbc url is empty");
+		this.driverName = Preconditions.checkNotNull(driverName, "driver name is empty");
+		this.username = username;
+		this.password = password;
+	}
+
+	public String getDbURL() {
+		return url;
+	}
+
+	public String getDriverName() {
+		return driverName;
+	}
+
+	public String getUsername() {
+		return username;
+	}
+
+	public String getPassword() {
+		return password;
+	}
+
 
 Review comment:
   This name clashes with `JDBCOptions#builder`, which is part of the current public API.
   I didn't want similar methods to be named differently, so I decided to just create the builder with `new`.
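
   To make that concrete, here is a trimmed, standalone re-creation of the pattern (for illustration only; the builder and its `with...` setter names are hypothetical, not the PR's actual API). The point is that the builder is instantiated with `new` rather than obtained from a static `builder()` factory, which sidesteps the name clash:

   ```java
   import java.util.Objects;

   /**
    * Trimmed, standalone re-creation (illustration only) of the pattern in the
    * JdbcConnectionOptions hunk above. The builder and its with... setter
    * names are hypothetical; the point is that the builder is created with
    * `new` rather than via a static builder() factory.
    */
   public class ConnectionOptionsSketch {
       private final String url;
       private final String driverName;

       private ConnectionOptionsSketch(String url, String driverName) {
           this.url = Objects.requireNonNull(url, "jdbc url is empty");
           this.driverName = Objects.requireNonNull(driverName, "driver name is empty");
       }

       public String getDbURL() {
           return url;
       }

       public String getDriverName() {
           return driverName;
       }

       /** Instantiated with `new`, avoiding a clash with an existing static builder() in the public API. */
       public static final class Builder {
           private String url;
           private String driverName;

           public Builder withUrl(String url) {
               this.url = url;
               return this;
           }

           public Builder withDriverName(String driverName) {
               this.driverName = driverName;
               return this;
           }

           public ConnectionOptionsSketch build() {
               return new ConnectionOptionsSketch(url, driverName);
           }
       }

       public static void main(String[] args) {
           ConnectionOptionsSketch options = new Builder()
                   .withUrl("jdbc:derby:memory:test")
                   .withDriverName("org.apache.derby.jdbc.EmbeddedDriver")
                   .build();
           System.out.println(options.getDbURL());
       }
   }
   ```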

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383739398
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcExecutionOptions.java
 ##########
 @@ -0,0 +1,92 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.util.Preconditions;
+
+import java.io.Serializable;
+
+/**
+ * JDBC sink batch options.
+ */
+@PublicEvolving
+public class JdbcExecutionOptions implements Serializable {
 
 Review comment:
   It seems that this option is only used for batch writing. `Execution` sounds like it could also cover reading/querying, because `PreparedStatement` also exposes an `executeQuery` method.
   Shall we rename it to `JdbcWritingOptions` or `JdbcSinkOptions`?

----------------------------------------------------------------

[GitHub] [flink] flinkbot commented on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot commented on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584613911
 
 
   Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress of the review.
   
   
   ## Automated Checks
   Last check on commit 06ba58a8738f28535d4858b5a5efc863cda620f9 (Tue Feb 11 12:34:54 UTC 2020)
   
   **Warnings:**
    * **This pull request references an unassigned [Jira ticket](https://issues.apache.org/jira/browse/FLINK-15782).** According to the [code contribution guide](https://flink.apache.org/contributing/contribute-code.html), tickets need to be assigned before starting with the implementation work.
   
   
   <sub>Mention the bot in a comment to re-run the automated checks.</sub>
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
   * ❓ 2. There is [consensus] that the contribution should go into to Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process.<details>
     The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`)
    - `@flinkbot approve all` to approve all aspects
    - `@flinkbot approve-until architecture` to approve everything until `architecture`
    - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention
    - `@flinkbot disapprove architecture` to remove an approval you gave earlier
   </details>

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385486500
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -46,15 +47,15 @@
 		}
 	}
 
-	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	interface StatementExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
 
 Review comment:
   What about `StatementExecutorFactory`?

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r385485796
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcExecutionOptions.java
 ##########
 @@ -0,0 +1,92 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.util.Preconditions;
+
+import java.io.Serializable;
+
+/**
+ * JDBC sink batch options.
+ */
+@PublicEvolving
+public class JdbcExecutionOptions implements Serializable {
 
 Review comment:
   I don't think batch could mean `limit`. From my understanding, `JdbcExecutionOptions` is just a tuning option, which shouldn't affect the final result. And the `batchIntervalMs` doesn't make sense for readers.
   But if we don't have a better name for it, I'm also fine with `JdbcExecutionOptions`.

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * cf15b1eae72eb65acbb1d3c145034b7005fac134 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/150879144) 
   * 7eb48481d1ba9fb8a3ec4a6a165ba188be1b300d Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/151226586) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383682636
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcConnectionOptions.java
 ##########
 @@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.util.Preconditions;
+
+import java.io.Serializable;
+
+/**
+ * JDBC connection options.
+ */
+@PublicEvolving
+public class JdbcConnectionOptions implements Serializable {
+
+	private static final long serialVersionUID = 1L;
+
+	protected final String url;
+	protected final String driverName;
+	protected final String username;
+	protected final String password;
+
+	public JdbcConnectionOptions(String url, String driverName, String username, String password) {
+		this.url = Preconditions.checkNotNull(url, "jdbc url is empty");
+		this.driverName = Preconditions.checkNotNull(driverName, "driver name is empty");
+		this.username = username;
+		this.password = password;
+	}
+
+	public String getDbURL() {
+		return url;
+	}
+
+	public String getDriverName() {
+		return driverName;
+	}
+
+	public String getUsername() {
+		return username;
+	}
+
+	public String getPassword() {
+		return password;
+	}
+
 
 Review comment:
   Add a `builder` static method to `JdbcConnectionOptions`?
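   To make the suggestion concrete, here is a minimal, hypothetical sketch of what a static `builder()` on `JdbcConnectionOptions` could look like. The builder method names (`withUrl`, etc.) are illustrative and not necessarily what was merged; `java.util.Objects.requireNonNull` stands in for Flink's `Preconditions.checkNotNull` so the sketch is self-contained:

   ```java
   import java.io.Serializable;
   import java.util.Objects;

   // Hypothetical sketch of JdbcConnectionOptions with the suggested builder.
   class JdbcConnectionOptions implements Serializable {

       private static final long serialVersionUID = 1L;

       private final String url;
       private final String driverName;
       private final String username;
       private final String password;

       // Constructor becomes private: the builder is the single entry point.
       private JdbcConnectionOptions(String url, String driverName, String username, String password) {
           this.url = Objects.requireNonNull(url, "jdbc url is empty");
           this.driverName = Objects.requireNonNull(driverName, "driver name is empty");
           this.username = username;
           this.password = password;
       }

       // The static method the review comment asks for.
       public static Builder builder() {
           return new Builder();
       }

       public String getDbURL() { return url; }
       public String getDriverName() { return driverName; }
       public String getUsername() { return username; }
       public String getPassword() { return password; }

       public static class Builder {
           private String url;
           private String driverName;
           private String username;
           private String password;

           public Builder withUrl(String url) { this.url = url; return this; }
           public Builder withDriverName(String driverName) { this.driverName = driverName; return this; }
           public Builder withUsername(String username) { this.username = username; return this; }
           public Builder withPassword(String password) { this.password = password; return this; }

           public JdbcConnectionOptions build() {
               return new JdbcConnectionOptions(url, driverName, username, password);
           }
       }
   }
   ```

   The optional fields (`username`, `password`) can then simply be left unset, instead of passing explicit `null`s to a four-argument constructor.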


[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   * 84593f60a55ba420204cb263172be5e75463bd8b Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148445223) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072) 
   


[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389772764
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,316 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.JdbcExecutionOptions.JdbcExecutionOptionsBuilder;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.api.java.io.jdbc.JDBCUtils.setRecordToStatement;
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface StatementExecutorFactory<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final StatementExecutorFactory<JdbcExec> statementExecutorFactory;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			StatementExecutorFactory<JdbcExec> statementExecutorFactory,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
 
 Review comment:
   I would also add null checks here wherever needed.
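   To illustrate, a hedged sketch of what the constructor could look like with the requested null checks. `Object` fields and `java.util.Objects.requireNonNull` are stand-ins so the sketch compiles on its own; the Flink code base would keep the real parameter types and use `Preconditions.checkNotNull`:

   ```java
   import static java.util.Objects.requireNonNull;

   // Hypothetical sketch: constructor arguments validated eagerly so a
   // misconfigured sink fails at construction time, not at open()/write time.
   class JdbcBatchingOutputFormatSketch {
       private final Object connectionProvider;
       private final Object executionOptions;
       private final Object statementExecutorFactory;
       private final Object recordExtractor;

       JdbcBatchingOutputFormatSketch(
               Object connectionProvider,
               Object executionOptions,
               Object statementExecutorFactory,
               Object recordExtractor) {
           this.connectionProvider = requireNonNull(connectionProvider, "connectionProvider must not be null");
           this.executionOptions = requireNonNull(executionOptions, "executionOptions must not be null");
           this.statementExecutorFactory = requireNonNull(statementExecutorFactory, "statementExecutorFactory must not be null");
           this.recordExtractor = requireNonNull(recordExtractor, "recordExtractor must not be null");
       }
   }
   ```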


[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 06ba58a8738f28535d4858b5a5efc863cda620f9 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148370417) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5055) 
   * 40cc737da998c1e3a696d3c2714e2c76c5f34b8d Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/148375156) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5058) 
   * 457f7fdcabace52c48926f116b3878a1cec0f550 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/148380612) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5061) 
   * 84593f60a55ba420204cb263172be5e75463bd8b Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/148445223) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5072) 
   * e6697a6c2cf078d2b7bc4118ab0c43fe4f3c3127 UNKNOWN
   


[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383176655
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,298 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final ExecutorCreator<JdbcExec> statementRunnerCreator;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			ExecutorCreator<JdbcExec> statementExecutorCreator,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
+		this.statementRunnerCreator = statementExecutorCreator;
+		this.jdbcRecordExtractor = recordExtractor;
+	}
+
+	/**
+	 * Connects to the target database and initializes the prepared statement.
+	 *
+	 * @param taskNumber The number of the parallel instance.
+	 */
+	@Override
+	public void open(int taskNumber, int numTasks) throws IOException {
+		super.open(taskNumber, numTasks);
+		jdbcStatementExecutor = statementRunnerCreator.apply(getRuntimeContext());
 
 Review comment:
   Maybe introduce a method like: 
   
   ```
   private JdbcExec initStatementExecutor() throws IOException {
   		final JdbcExec exec = statementRunnerCreator.apply(getRuntimeContext());
   		try {
   			exec.open(connection);
   		} catch (SQLException e) {
   			throw new IOException("unable to open JDBC writer", e);
   		}
   		return exec;
   	}
   ```
   
   To offload a bit of the code in the `open()`. What do you think?


[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389941896
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,316 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.JdbcExecutionOptions.JdbcExecutionOptionsBuilder;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.api.java.io.jdbc.JDBCUtils.setRecordToStatement;
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface StatementExecutorFactory<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final StatementExecutorFactory<JdbcExec> statementExecutorFactory;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
 
 Review comment:
   Hmm...I think we can have a different "template method" - `addToBatch` instead of `doWriteRecord`; so this field is accessed only in the base class and can be `private`:
   ```
   @Override
   void addToBatch(Tuple2<Boolean, Row> original, Row extracted) throws SQLException {
   ...
   ```
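   A minimal, hypothetical sketch of that template-method shape, with simplified stand-in types so it is self-contained: the base class owns the extractor and drives `writeRecord`, while subclasses only implement `addToBatch`, so the extractor field can stay `private`:

   ```java
   import java.util.ArrayList;
   import java.util.List;
   import java.util.function.Function;

   // Base class: keeps the record extractor private and exposes only the
   // addToBatch template method to subclasses.
   abstract class BatchingFormat<In, JdbcIn> {
       private final Function<In, JdbcIn> extractor;
       final List<JdbcIn> batch = new ArrayList<>();

       BatchingFormat(Function<In, JdbcIn> extractor) {
           this.extractor = extractor;
       }

       // Template method: extraction happens here, once, in the base class.
       final void writeRecord(In record) {
           addToBatch(record, extractor.apply(record));
       }

       // Subclasses get both the original record and the extracted JDBC value,
       // mirroring the addToBatch(Tuple2<Boolean, Row> original, Row extracted)
       // signature from the comment above.
       abstract void addToBatch(In original, JdbcIn extracted);
   }

   // Simplest subclass: identity extraction, appends to the batch.
   class SimpleBatchingFormat<T> extends BatchingFormat<T, T> {
       SimpleBatchingFormat() {
           super(x -> x);
       }

       @Override
       void addToBatch(T original, T extracted) {
           batch.add(extracted);
       }
   }
   ```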


[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * 246a3cb4a62b59256ced746abfecfa18d5064745 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/151894618) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=5948) 
   


[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383178588
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,298 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final ExecutorCreator<JdbcExec> statementRunnerCreator;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			ExecutorCreator<JdbcExec> statementExecutorCreator,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
+		this.statementRunnerCreator = statementExecutorCreator;
+		this.jdbcRecordExtractor = recordExtractor;
+	}
+
+	/**
+	 * Connects to the target database and initializes the prepared statement.
+	 *
+	 * @param taskNumber The number of the parallel instance.
+	 */
+	@Override
+	public void open(int taskNumber, int numTasks) throws IOException {
+		super.open(taskNumber, numTasks);
+		jdbcStatementExecutor = statementRunnerCreator.apply(getRuntimeContext());
+		try {
+			jdbcStatementExecutor.open(connection);
+		} catch (SQLException e) {
+			throw new IOException("unable to open JDBC writer", e);
+		}
+		if (executionOptions.getBatchIntervalMs() != 0 && executionOptions.getBatchSize() != 1) {
+			this.scheduler = Executors.newScheduledThreadPool(1, new ExecutorThreadFactory("jdbc-upsert-output-format"));
+			this.scheduledFuture = this.scheduler.scheduleWithFixedDelay(() -> {
+				synchronized (JdbcBatchingOutputFormat.this) {
+					if (!closed) {
 
 Review comment:
   Are we sure about the concurrency here? At this point `closed` is guarded by a lock, but the remaining usages are not. Here we are only reading the value, which is `volatile`, but `close()` contains `if (!closed) { closed = true;  ... `, which is not atomic, and then `cancel`s the `scheduledFuture` that executes this code.
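   As an illustration of the pattern under discussion (a standalone sketch, not the PR's code — names like `GuardedCloseSketch` are invented here): making both the periodic flush and the check-then-act in `close()` take the same monitor ensures the scheduled task can never observe a half-closed state.

   ```java
   import java.util.concurrent.Executors;
   import java.util.concurrent.ScheduledExecutorService;
   import java.util.concurrent.TimeUnit;

   public class GuardedCloseSketch {
       private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
       private volatile boolean closed = false;
       private int flushes = 0;

       void start(long intervalMs) {
           scheduler.scheduleWithFixedDelay(() -> {
               synchronized (this) {
                   if (!closed) {
                       flushes++; // stand-in for flush()
                   }
               }
           }, intervalMs, intervalMs, TimeUnit.MILLISECONDS);
       }

       synchronized void close() {
           // check-then-act is atomic here: we hold the same monitor as the flush task
           if (!closed) {
               closed = true;
               scheduler.shutdown();
           }
       }

       synchronized int flushCount() {
           return flushes;
       }

       public static void main(String[] args) throws Exception {
           GuardedCloseSketch sketch = new GuardedCloseSketch();
           sketch.start(10);
           Thread.sleep(100);
           sketch.close();
           int afterClose = sketch.flushCount();
           Thread.sleep(50);
           if (sketch.flushCount() != afterClose) {
               throw new AssertionError("flush ran after close");
           }
           System.out.println("no flush after close");
       }
   }
   ```

   Because the flush body re-checks `closed` under the monitor, a flush scheduled just before `close()` either completes entirely before the close or becomes a no-op afterwards.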

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r389944217
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcDmlOptions.java
 ##########
 @@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.java.io.jdbc.dialect.JDBCDialect;
+import org.apache.flink.util.Preconditions;
+
+import javax.annotation.Nullable;
+
+import java.util.Optional;
+import java.util.stream.Stream;
+
+/**
+ * JDBC sink DML options.
+ */
+public class JdbcDmlOptions extends JdbcTypedQueryOptions {
+
+	private static final long serialVersionUID = 1L;
+
+	private final String[] fieldNames;
+	@Nullable
+	private final String[] keyFields;
+	private final String tableName;
+	private final JDBCDialect dialect;
+
+	public static JdbcDmlOptionsBuilder builder() {
+		return new JdbcDmlOptionsBuilder();
+	}
+
+	private JdbcDmlOptions(String tableName, JDBCDialect dialect, String[] fieldNames, int[] fieldTypes, String[] keyFields) {
+		super(fieldTypes);
+		this.tableName = Preconditions.checkNotNull(tableName, "table is empty");
+		this.dialect = Preconditions.checkNotNull(dialect, "dialect name is empty");
+		this.fieldNames = Preconditions.checkNotNull(fieldNames, "field names is empty");
+		this.keyFields = keyFields;
+	}
+
+	public String getTableName() {
+		return tableName;
+	}
+
+	public JDBCDialect getDialect() {
+		return dialect;
+	}
+
+	public String[] getFieldNames() {
+		return fieldNames;
+	}
+
+	public Optional<String[]> getKeyFields() {
+		return Optional.ofNullable(keyFields);
+	}
+
+	/**
+	 * JDBCUpsertOptionsBuilder.
 
 Review comment:
   OK :)

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383712618
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcBatchingOutputFormat.java
 ##########
 @@ -0,0 +1,298 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.common.functions.RuntimeContext;
+import org.apache.flink.api.java.io.jdbc.executor.JdbcBatchStatementExecutor;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.runtime.util.ExecutorThreadFactory;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Preconditions;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.io.Serializable;
+import java.sql.SQLException;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import static org.apache.flink.util.Preconditions.checkNotNull;
+
+class JdbcBatchingOutputFormat<In, JdbcIn, JdbcExec extends JdbcBatchStatementExecutor<JdbcIn>> extends AbstractJdbcOutputFormat<In> {
+	interface RecordExtractor<F, T> extends Function<F, T>, Serializable {
+		static <T> RecordExtractor<T, T> identity() {
+			return x -> x;
+		}
+	}
+
+	interface ExecutorCreator<T extends JdbcBatchStatementExecutor<?>> extends Function<RuntimeContext, T>, Serializable {
+	}
+
+	private static final long serialVersionUID = 1L;
+
+	private static final Logger LOG = LoggerFactory.getLogger(JdbcBatchingOutputFormat.class);
+
+	private final JdbcExecutionOptions executionOptions;
+	private final ExecutorCreator<JdbcExec> statementRunnerCreator;
+	final RecordExtractor<In, JdbcIn> jdbcRecordExtractor;
+
+	private transient JdbcExec jdbcStatementExecutor;
+	private transient int batchCount = 0;
+	private transient volatile boolean closed = false;
+
+	private transient ScheduledExecutorService scheduler;
+	private transient ScheduledFuture<?> scheduledFuture;
+	private transient volatile Exception flushException;
+
+	JdbcBatchingOutputFormat(
+			JdbcConnectionProvider connectionProvider,
+			JdbcExecutionOptions executionOptions,
+			ExecutorCreator<JdbcExec> statementExecutorCreator,
+			RecordExtractor<In, JdbcIn> recordExtractor) {
+		super(connectionProvider);
+		this.executionOptions = executionOptions;
+		this.statementRunnerCreator = statementExecutorCreator;
+		this.jdbcRecordExtractor = recordExtractor;
+	}
+
+	/**
+	 * Connects to the target database and initializes the prepared statement.
+	 *
+	 * @param taskNumber The number of the parallel instance.
+	 */
+	@Override
+	public void open(int taskNumber, int numTasks) throws IOException {
+		super.open(taskNumber, numTasks);
+		jdbcStatementExecutor = statementRunnerCreator.apply(getRuntimeContext());
+		try {
+			jdbcStatementExecutor.open(connection);
+		} catch (SQLException e) {
+			throw new IOException("unable to open JDBC writer", e);
+		}
+		if (executionOptions.getBatchIntervalMs() != 0 && executionOptions.getBatchSize() != 1) {
+			this.scheduler = Executors.newScheduledThreadPool(1, new ExecutorThreadFactory("jdbc-upsert-output-format"));
+			this.scheduledFuture = this.scheduler.scheduleWithFixedDelay(() -> {
+				synchronized (JdbcBatchingOutputFormat.this) {
+					if (!closed) {
+						try {
+							flush();
+						} catch (Exception e) {
+							flushException = e;
+						}
+					}
+				}
+			}, executionOptions.getBatchIntervalMs(), executionOptions.getBatchIntervalMs(), TimeUnit.MILLISECONDS);
+		}
+	}
+
+	private void checkFlushException() {
+		if (flushException != null) {
+			throw new RuntimeException("Writing records to JDBC failed.", flushException);
+		}
+	}
+
+	@Override
+	public final synchronized void writeRecord(In record) {
+		checkFlushException();
+
+		try {
+			doWriteRecord(record);
+			batchCount++;
+			if (batchCount >= executionOptions.getBatchSize()) {
+				flush();
+			}
+		} catch (Exception e) {
+			throw new RuntimeException("Writing records to JDBC failed.", e);
+		}
+	}
+
+	void doWriteRecord(In record) throws SQLException {
+		jdbcStatementExecutor.process(jdbcRecordExtractor.apply(record));
+	}
+
+	@Override
+	public synchronized void flush() throws IOException {
+		checkFlushException();
+
+		for (int i = 1; i <= executionOptions.getMaxRetries(); i++) {
+			try {
+				attemptFlush();
+				batchCount = 0;
+				break;
+			} catch (SQLException e) {
+				LOG.error("JDBC executeBatch error, retry times = {}", i, e);
+				if (i >= executionOptions.getMaxRetries()) {
+					throw new IOException(e);
+				}
+				try {
+					Thread.sleep(1000 * i);
+				} catch (InterruptedException ex) {
+					Thread.currentThread().interrupt();
+					throw new IOException("unable to flush; interrupted while doing another attempt", e);
+				}
+			}
+		}
+	}
+
+	void attemptFlush() throws SQLException {
+		jdbcStatementExecutor.executeBatch();
+	}
+
+	/**
+	 * Executes prepared statement and closes all resources of this instance.
+	 *
+	 */
+	@Override
+	public synchronized void close() {
+		if (!closed) {
+			closed = true;
+
+			checkFlushException();
+
+			if (this.scheduledFuture != null) {
+				scheduledFuture.cancel(false);
+				this.scheduler.shutdown();
+			}
+
+			if (batchCount > 0) {
+				try {
+					flush();
+				} catch (Exception e) {
+					throw new RuntimeException("Writing records to JDBC failed.", e);
+				}
+			}
+
+			try {
+				jdbcStatementExecutor.close();
+			} catch (SQLException e) {
+				LOG.warn("Close JDBC writer failed.", e);
+			}
+		}
+		super.close();
+	}
+
+	public static Builder builder() {
+		return new Builder();
+	}
+
+	/**
+	 * Builder for a {@link JdbcBatchingOutputFormat}.
+	 */
+	public static class Builder {
+		private JDBCOptions options;
+		private String[] fieldNames;
+		private String[] keyFields;
+		private int[] fieldTypes;
+		private int flushMaxSize = DEFAULT_FLUSH_MAX_SIZE;
 
 Review comment:
   Can we use `JdbcExecutionOptions#Builder` here? It's hard and error-prone to maintain the default values in different places.
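   A minimal sketch of the suggestion (hypothetical classes `ExecutionOptions` and `FormatBuilder`, not the PR's code): let the output-format builder accept an options object whose own builder is the single owner of the defaults, instead of re-declaring `DEFAULT_FLUSH_MAX_SIZE` in two places.

   ```java
   public class OptionsReuseSketch {

       /** Stand-in for JdbcExecutionOptions: the only place batching defaults live. */
       static final class ExecutionOptions {
           final int batchSize;
           final long batchIntervalMs;

           private ExecutionOptions(int batchSize, long batchIntervalMs) {
               this.batchSize = batchSize;
               this.batchIntervalMs = batchIntervalMs;
           }

           static Builder builder() {
               return new Builder();
           }

           static final class Builder {
               private int batchSize = 5000;      // default declared once, here
               private long batchIntervalMs = 0L; // ditto

               Builder withBatchSize(int batchSize) {
                   this.batchSize = batchSize;
                   return this;
               }

               ExecutionOptions build() {
                   return new ExecutionOptions(batchSize, batchIntervalMs);
               }
           }
       }

       /** Stand-in for the output-format builder: takes options whole, no raw ints. */
       static final class FormatBuilder {
           ExecutionOptions executionOptions = ExecutionOptions.builder().build();

           FormatBuilder setExecutionOptions(ExecutionOptions options) {
               this.executionOptions = options;
               return this;
           }
       }

       public static void main(String[] args) {
           // Defaults flow from ExecutionOptions.Builder, not the format builder.
           FormatBuilder withDefaults = new FormatBuilder();
           if (withDefaults.executionOptions.batchSize != 5000) throw new AssertionError();

           // Overrides still go through the single builder.
           FormatBuilder custom = new FormatBuilder()
                   .setExecutionOptions(ExecutionOptions.builder().withBatchSize(100).build());
           if (custom.executionOptions.batchSize != 100) throw new AssertionError();

           System.out.println("single source of defaults: " + withDefaults.executionOptions.batchSize);
       }
   }
   ```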

----------------------------------------------------------------

[GitHub] [flink] wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383691006
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JdbcDmlOptions.java
 ##########
 @@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc;
+
+import org.apache.flink.api.java.io.jdbc.dialect.JDBCDialect;
+import org.apache.flink.util.Preconditions;
+
+import javax.annotation.Nullable;
+
+import java.util.stream.Stream;
+
+/**
+ * JDBC sink DML options.
+ */
+public class JdbcDmlOptions extends JdbcTypedQueryOptions {
+
+	private static final long serialVersionUID = 1L;
+
+	private final String[] fieldNames;
+	@Nullable
+	private final String[] keyFields;
+	private final String tableName;
+	private final JDBCDialect dialect;
+
+	public static JdbcDmlOptionsBuilder builder() {
+		return new JdbcDmlOptionsBuilder();
+	}
+
+	private JdbcDmlOptions(String tableName, JDBCDialect dialect, String[] fieldNames, int[] fieldTypes, String[] keyFields) {
+		super(fieldTypes);
+		this.tableName = Preconditions.checkNotNull(tableName, "table is empty");
+		this.dialect = Preconditions.checkNotNull(dialect, "dialect name is empty");
+		this.fieldNames = Preconditions.checkNotNull(fieldNames, "field names is empty");
+		this.keyFields = keyFields;
+	}
+
+	public String getTableName() {
+		return tableName;
+	}
+
+	public JDBCDialect getDialect() {
+		Preconditions.checkNotNull(dialect, "dialect not set");
+		return dialect;
+	}
+
+	public String[] getFieldNames() {
+		return fieldNames;
+	}
+
+	public String[] getKeyFields() {
+		return keyFields;
 
 Review comment:
   Should we return an `Optional<String[]>` here, since `keyFields` is nullable?
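   A sketch of the suggested getter (simplified, standalone class; `OptionalKeyFieldsSketch` is invented for illustration): wrapping the nullable field in `Optional.ofNullable` forces callers to handle the append-only case explicitly instead of null-checking.

   ```java
   import java.util.Optional;

   public class OptionalKeyFieldsSketch {
       private final String[] keyFields; // may be null for append-only sinks

       OptionalKeyFieldsSketch(String[] keyFields) {
           this.keyFields = keyFields;
       }

       public Optional<String[]> getKeyFields() {
           return Optional.ofNullable(keyFields);
       }

       public static void main(String[] args) {
           OptionalKeyFieldsSketch upsert = new OptionalKeyFieldsSketch(new String[]{"id"});
           OptionalKeyFieldsSketch append = new OptionalKeyFieldsSketch(null);
           if (!upsert.getKeyFields().isPresent()) throw new AssertionError();
           if (append.getKeyFields().isPresent()) throw new AssertionError();
           System.out.println("optional key fields handled");
       }
   }
   ```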

----------------------------------------------------------------

[GitHub] [flink] kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
kl0u commented on a change in pull request #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#discussion_r383181045
 
 

 ##########
 File path: flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/executor/KeyedBatchStatementExecutor.java
 ##########
 @@ -0,0 +1,79 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.java.io.jdbc.executor;
+
+import org.apache.flink.api.java.io.jdbc.JdbcStatementBuilder;
+
+import java.sql.Connection;
+import java.sql.PreparedStatement;
+import java.sql.SQLException;
+import java.util.HashSet;
+import java.util.Set;
+import java.util.function.Function;
+
 
 Review comment:
   Here a javadoc for the types and the functionality would help. I noticed that none of the classes have one, so it may not be part of this PR, but I think it would be a helpful addition.
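   To illustrate the kind of documentation being asked for (a hypothetical sketch — `BatchExecutorSketch` is a simplified stand-in for the executor interface, and the doc text is illustrative, not from the PR):

   ```java
   import java.util.ArrayList;
   import java.util.List;

   /**
    * Buffers records of type {@code T} via {@link #process(Object)} and writes
    * them to the database in a single batch via {@link #executeBatch()}.
    *
    * @param <T> the type of records accumulated between batch executions
    */
   interface BatchExecutorSketch<T> {
       /** Buffers {@code record} for the next batch. */
       void process(T record);

       /** Writes all buffered records and clears the buffer. */
       void executeBatch();
   }

   public class JavadocSketch {
       public static void main(String[] args) {
           List<String> written = new ArrayList<>();
           List<String> buffer = new ArrayList<>();
           BatchExecutorSketch<String> exec = new BatchExecutorSketch<String>() {
               public void process(String record) { buffer.add(record); }
               public void executeBatch() { written.addAll(buffer); buffer.clear(); }
           };
           exec.process("a");
           exec.process("b");
           exec.executeBatch();
           if (written.size() != 2) throw new AssertionError();
           System.out.println("batched " + written.size() + " records");
       }
   }
   ```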

----------------------------------------------------------------

[GitHub] [flink] flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-584620778
 
 
   ## CI report:
   
   * cf15b1eae72eb65acbb1d3c145034b7005fac134 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/150879144) 
   * 7eb48481d1ba9fb8a3ec4a6a165ba188be1b300d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

----------------------------------------------------------------

[GitHub] [flink] rkhachatryan commented on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API

Posted by GitBox <gi...@apache.org>.
rkhachatryan commented on issue #11061: [FLINK-15782] [connectors/jdbc] JDBC sink DataStream API
URL: https://github.com/apache/flink/pull/11061#issuecomment-593101446
 
 
   Please review the changes (I've added them as a separate commit).

----------------------------------------------------------------