Posted to commits@doris.apache.org by GitBox <gi...@apache.org> on 2021/07/17 00:18:48 UTC

[GitHub] [incubator-doris] huzk8 opened a new pull request #6256: [feature]:support spark connector sink data to doris

huzk8 opened a new pull request #6256:
URL: https://github.com/apache/incubator-doris/pull/6256


   ## Proposed changes
   
   Support the Spark connector writing a DataFrame to Doris.
   
   ## Types of changes
   
   What types of changes does your code introduce to Doris?
   _Put an `x` in the boxes that apply_
   
   - [ ] Bugfix (non-breaking change which fixes an issue)
   - [x] New feature (non-breaking change which adds functionality)
   - [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
   - [ ] Documentation Update (if none of the other choices apply)
   - [ ] Code refactor (Modify the code structure, format the code, etc...)
   - [ ] Optimization. Including functional usability improvements and performance improvements.
   - [ ] Dependency. Such as changes related to third-party components.
   - [ ] Other.
   
   ## Checklist
   
   _Put an `x` in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your code._
   
   - [ ] I have created an issue on (Fix #ISSUE) and described the bug/feature there in detail
   - [ ] Compiling and unit tests pass locally with my changes
   - [ ] I have added tests that prove my fix is effective or that my feature works
   - [x] If these changes need document changes, I have updated the document
   - [ ] Any dependent changes have been merged
   
   ## Further comments
   
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@doris.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@doris.apache.org
For additional commands, e-mail: commits-help@doris.apache.org


[GitHub] [incubator-doris] hf200012 commented on a change in pull request #6256: [feature]:support spark connector sink data to doris

Posted by GitBox <gi...@apache.org>.
hf200012 commented on a change in pull request #6256:
URL: https://github.com/apache/incubator-doris/pull/6256#discussion_r683117899



##########
File path: extension/spark-doris-connector/src/main/java/org/apache/doris/spark/DorisStreamLoad.java
##########
@@ -0,0 +1,179 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+package org.apache.doris.spark;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+import org.apache.doris.spark.exception.StreamLoadException;
+import org.apache.doris.spark.rest.models.RespContent;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.BufferedOutputStream;
+import java.io.BufferedReader;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.io.Serializable;
+import java.net.HttpURLConnection;
+import java.net.URL;
+import java.nio.charset.StandardCharsets;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Base64;
+import java.util.Calendar;
+import java.util.List;
+import java.util.UUID;
+
+/**
+ * DorisStreamLoad
+ **/
+public class DorisStreamLoad implements Serializable{
+
+    private static final Logger LOG = LoggerFactory.getLogger(DorisStreamLoad.class);
+
+    private final static List<String> DORIS_SUCCESS_STATUS = new ArrayList<>(Arrays.asList("Success", "Publish Timeout"));
+    private static String loadUrlPattern = "http://%s/api/%s/%s/_stream_load?";
+    private String user;
+    private String passwd;
+    private String loadUrlStr;
+    private String hostPort;
+    private String db;
+    private String tbl;
+    private String authEncoding;
+
+    public DorisStreamLoad(String hostPort, String db, String tbl, String user, String passwd) {
+        this.hostPort = hostPort;
+        this.db = db;
+        this.tbl = tbl;
+        this.user = user;
+        this.passwd = passwd;
+        this.loadUrlStr = String.format(loadUrlPattern, hostPort, db, tbl);
+        this.authEncoding = Base64.getEncoder().encodeToString(String.format("%s:%s", user, passwd).getBytes(StandardCharsets.UTF_8));
+    }
+
+    public String getLoadUrlStr() {
+        return loadUrlStr;
+    }
+
+    public String getHostPort() {
+        return hostPort;
+    }
+
+    public void setHostPort(String hostPort) {
+        this.hostPort = hostPort;
+        this.loadUrlStr = String.format(loadUrlPattern, hostPort, this.db, this.tbl);
+    }
+
+
+    private HttpURLConnection getConnection(String urlStr, String label) throws IOException {
+        URL url = new URL(urlStr);
+        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
+        conn.setInstanceFollowRedirects(false);
+        conn.setRequestMethod("PUT");
+        String authEncoding = Base64.getEncoder().encodeToString(String.format("%s:%s", user, passwd).getBytes(StandardCharsets.UTF_8));
+        conn.setRequestProperty("Authorization", "Basic " + authEncoding);
+        conn.addRequestProperty("Expect", "100-continue");
+        conn.addRequestProperty("Content-Type", "text/plain; charset=UTF-8");
+        conn.addRequestProperty("label", label);
+        conn.setDoOutput(true);
+        conn.setDoInput(true);
+        return conn;
+    }
+
+    public static class LoadResponse {
+        public int status;
+        public String respMsg;
+        public String respContent;
+
+        public LoadResponse(int status, String respMsg, String respContent) {
+            this.status = status;
+            this.respMsg = respMsg;
+            this.respContent = respContent;
+        }
+        @Override
+        public String toString() {
+            StringBuilder sb = new StringBuilder();
+            sb.append("status: ").append(status);
+            sb.append(", resp msg: ").append(respMsg);
+            sb.append(", resp content: ").append(respContent);
+            return sb.toString();
+        }
+    }
+
+    public void load(String value) throws StreamLoadException {
+        LoadResponse loadResponse = loadBatch(value);

Review comment:
       It is recommended to add a maximum number of failed retries, to avoid load failures caused by short-lived network jitter.
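A minimal sketch of the suggested retry behavior. The helper and its names (`RetryHelper`, `withRetries`, `maxRetries`, `backoffMs`) are illustrative, not part of the PR as submitted:

```java
// Illustrative retry helper; RetryHelper, withRetries, maxRetries and
// backoffMs are hypothetical names, not part of the PR as submitted.
public final class RetryHelper {

    public interface Attempt<T> {
        T run() throws Exception;
    }

    // Retries the attempt up to maxRetries extra times, sleeping with a
    // simple linear backoff between attempts, so short-lived network
    // jitter does not fail the whole load.
    public static <T> T withRetries(int maxRetries, long backoffMs, Attempt<T> attempt)
            throws Exception {
        Exception last = null;
        for (int i = 0; i <= maxRetries; i++) {
            try {
                return attempt.run();
            } catch (Exception e) {
                last = e;
                if (i < maxRetries) {
                    Thread.sleep(backoffMs * (i + 1L));
                }
            }
        }
        throw last;
    }
}
```

The `load` path could then wrap the batch call, e.g. `RetryHelper.withRetries(3, 500, () -> loadBatch(value))`, so only the final failure surfaces as a `StreamLoadException`.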

##########
File path: extension/spark-doris-connector/src/main/scala/org/apache/doris/spark/sql/DorisOptions.scala
##########
@@ -0,0 +1,11 @@
+package org.apache.doris.spark.sql
+
+object DorisOptions {
+  val beHostPort="beHostPort"

Review comment:
       The FE address should be configured here instead. The list of live BE nodes can be obtained from the FE, and then a round-robin (or other) strategy can be used during Stream Load to connect directly to a BE for the load, avoiding pressure on the FE.
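One possible shape for that strategy, assuming the list of live BE `host:port` strings has already been fetched from the FE (the fetch itself, and the exact FE endpoint, are not shown here; the class name is hypothetical):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical selector; assumes `backends` is the list of live BE
// host:port strings previously obtained from the FE.
public final class BackendSelector {

    private final List<String> backends;
    private final AtomicInteger cursor = new AtomicInteger(0);

    public BackendSelector(List<String> backends) {
        if (backends == null || backends.isEmpty()) {
            throw new IllegalArgumentException("backend list must not be empty");
        }
        this.backends = new ArrayList<>(backends);
    }

    // Round-robin over the BE nodes so Stream Load traffic is spread
    // across BEs instead of pressuring a single node (or the FE).
    public String next() {
        int i = Math.floorMod(cursor.getAndIncrement(), backends.size());
        return backends.get(i);
    }
}
```

A fuller implementation could also refresh the BE list from the FE when a load against the chosen node fails.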

##########
File path: extension/spark-doris-connector/src/main/java/org/apache/doris/spark/DorisStreamLoad.java
##########
@@ -0,0 +1,179 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+package org.apache.doris.spark;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+import org.apache.doris.spark.exception.StreamLoadException;
+import org.apache.doris.spark.rest.models.RespContent;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.BufferedOutputStream;
+import java.io.BufferedReader;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.io.Serializable;
+import java.net.HttpURLConnection;
+import java.net.URL;
+import java.nio.charset.StandardCharsets;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Base64;
+import java.util.Calendar;
+import java.util.List;
+import java.util.UUID;
+
+/**
+ * DorisStreamLoad
+ **/
+public class DorisStreamLoad implements Serializable{
+
+    private static final Logger LOG = LoggerFactory.getLogger(DorisStreamLoad.class);
+
+    private final static List<String> DORIS_SUCCESS_STATUS = new ArrayList<>(Arrays.asList("Success", "Publish Timeout"));
+    private static String loadUrlPattern = "http://%s/api/%s/%s/_stream_load?";
+    private String user;
+    private String passwd;
+    private String loadUrlStr;
+    private String hostPort;
+    private String db;
+    private String tbl;
+    private String authEncoding;
+
+    public DorisStreamLoad(String hostPort, String db, String tbl, String user, String passwd) {
+        this.hostPort = hostPort;
+        this.db = db;
+        this.tbl = tbl;
+        this.user = user;
+        this.passwd = passwd;
+        this.loadUrlStr = String.format(loadUrlPattern, hostPort, db, tbl);
+        this.authEncoding = Base64.getEncoder().encodeToString(String.format("%s:%s", user, passwd).getBytes(StandardCharsets.UTF_8));
+    }
+
+    public String getLoadUrlStr() {
+        return loadUrlStr;
+    }
+
+    public String getHostPort() {
+        return hostPort;
+    }
+
+    public void setHostPort(String hostPort) {
+        this.hostPort = hostPort;
+        this.loadUrlStr = String.format(loadUrlPattern, hostPort, this.db, this.tbl);
+    }
+
+
+    private HttpURLConnection getConnection(String urlStr, String label) throws IOException {
+        URL url = new URL(urlStr);
+        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
+        conn.setInstanceFollowRedirects(false);
+        conn.setRequestMethod("PUT");
+        String authEncoding = Base64.getEncoder().encodeToString(String.format("%s:%s", user, passwd).getBytes(StandardCharsets.UTF_8));
+        conn.setRequestProperty("Authorization", "Basic " + authEncoding);
+        conn.addRequestProperty("Expect", "100-continue");
+        conn.addRequestProperty("Content-Type", "text/plain; charset=UTF-8");
+        conn.addRequestProperty("label", label);
+        conn.setDoOutput(true);
+        conn.setDoInput(true);
+        return conn;
+    }
+
+    public static class LoadResponse {
+        public int status;
+        public String respMsg;
+        public String respContent;
+
+        public LoadResponse(int status, String respMsg, String respContent) {
+            this.status = status;
+            this.respMsg = respMsg;
+            this.respContent = respContent;
+        }
+        @Override
+        public String toString() {
+            StringBuilder sb = new StringBuilder();
+            sb.append("status: ").append(status);
+            sb.append(", resp msg: ").append(respMsg);
+            sb.append(", resp content: ").append(respContent);
+            return sb.toString();
+        }
+    }
+
+    public void load(String value) throws StreamLoadException {
+        LoadResponse loadResponse = loadBatch(value);
+        LOG.info("Streamload Response:{}",loadResponse);
+        if(loadResponse.status != 200){
+            throw new StreamLoadException("stream load error: " + loadResponse.respContent);
+        }else{
+            ObjectMapper obj = new ObjectMapper();
+            try {
+                RespContent respContent = obj.readValue(loadResponse.respContent, RespContent.class);
+                if(!DORIS_SUCCESS_STATUS.contains(respContent.getStatus())){
+                    throw new StreamLoadException("stream load error: " + respContent.getMessage());
+                }
+            } catch (IOException e) {
+                throw new StreamLoadException(e);
+            }
+        }
+    }
+
+    private LoadResponse loadBatch(String value) {
+        Calendar calendar = Calendar.getInstance();
+        String label = String.format("audit_%s%02d%02d_%02d%02d%02d_%s",

Review comment:
       It is recommended to use a `spark_connector_` prefix for the label name, which makes loads from this connector easy to distinguish.
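A hedged sketch of the suggested naming, using `java.time` instead of `Calendar`; the class and method names are illustrative:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.UUID;

// Hypothetical label generator showing the suggested spark_connector_
// prefix; the PR's code builds a similar label with Calendar and an
// audit_ prefix.
public final class LabelGenerator {

    private static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("yyyyMMdd_HHmmss");

    public static String newLabel() {
        // Labels must be unique per load job; the UUID suffix guards
        // against collisions when two batches start in the same second.
        return "spark_connector_" + LocalDateTime.now().format(FMT)
                + "_" + UUID.randomUUID().toString().replace("-", "");
    }
}
```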






[GitHub] [incubator-doris] morningman merged pull request #6256: [feature]:support spark connector sink data to doris

Posted by GitBox <gi...@apache.org>.
morningman merged pull request #6256:
URL: https://github.com/apache/incubator-doris/pull/6256


   




[GitHub] [incubator-doris] github-actions[bot] commented on pull request #6256: [feature]:support spark connector sink data to doris

Posted by GitBox <gi...@apache.org>.
github-actions[bot] commented on pull request #6256:
URL: https://github.com/apache/incubator-doris/pull/6256#issuecomment-899037311








[GitHub] [incubator-doris] JNSimba commented on a change in pull request #6256: [feature]:support spark connector sink data to doris

Posted by GitBox <gi...@apache.org>.
JNSimba commented on a change in pull request #6256:
URL: https://github.com/apache/incubator-doris/pull/6256#discussion_r675323336



##########
File path: extension/spark-doris-connector/src/main/scala/org/apache/doris/spark/sql/DorisSourceProvider.scala
##########
@@ -17,14 +17,77 @@
 
 package org.apache.doris.spark.sql
 
+import org.apache.commons.collections.CollectionUtils
+import org.apache.doris.spark.DorisStreamLoad
+import org.apache.doris.spark.exception.DorisException
 import org.apache.spark.internal.Logging
-import org.apache.spark.sql.SQLContext
-import org.apache.spark.sql.sources.{BaseRelation, DataSourceRegister, RelationProvider}
+import org.apache.spark.sql.{DataFrame, SQLContext, SaveMode}
+import org.apache.spark.sql.sources.{BaseRelation, CreatableRelationProvider, DataSourceRegister, Filter, RelationProvider}
+import org.apache.spark.sql.types.StructType
 
-private[sql] class DorisSourceProvider extends DataSourceRegister with RelationProvider with Logging {
+import scala.collection.mutable.ListBuffer
+import scala.util.Random
+
+private[sql] class DorisSourceProvider extends DataSourceRegister with RelationProvider with CreatableRelationProvider with Logging {
   override def shortName(): String = "doris"
 
   override def createRelation(sqlContext: SQLContext, parameters: Map[String, String]): BaseRelation = {
     new DorisRelation(sqlContext, Utils.params(parameters, log))
   }
+
+
+  /**
+   * df.save
+   */
+  override def createRelation(sqlContext: SQLContext,
+                              mode: SaveMode, parameters: Map[String, String],
+                              data: DataFrame): BaseRelation = {
+    val beHostPort: String = parameters.getOrElse(DorisOptions.beHostPort, throw new DorisException("beHostPort is empty"))
+
+    val dbName: String = parameters.getOrElse(DorisOptions.dbName, throw new DorisException("dbName is empty"))
+
+    val tbName: String = parameters.getOrElse(DorisOptions.tbName, throw new DorisException("tbName is empty"))
+
+    val user: String = parameters.getOrElse(DorisOptions.user, throw new DorisException("user is empty"))
+
+    val password: String = parameters.getOrElse(DorisOptions.password, throw new DorisException("password is empty"))
+
+    val maxRowCount: Long = parameters.getOrElse(DorisOptions.maxRowCount, "1024").toLong
+
+    val splitHost = beHostPort.split(";")
+    val choosedBeHost = splitHost(Random.nextInt(splitHost.length))
+    val dorisStreamLoader = new DorisStreamLoad(choosedBeHost, dbName, tbName, user, password)
+
+    data.foreachPartition(partition => {
+
+      val buffer = ListBuffer[String]()
+      partition.foreach(row => {
+        val rowString = row.toSeq.mkString("\t")

Review comment:
       Do we need to consider **null** values here?
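For context, Stream Load's text format uses the literal `\N` to represent NULL, while `row.toSeq.mkString("\t")` would render a null field as the string `"null"`. An illustrative Java sketch of the mapping (names are hypothetical):

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical formatter; NULL_VALUE is the \N marker Stream Load's
// text format uses for NULL fields.
public final class RowFormatter {

    static final String NULL_VALUE = "\\N";

    // Joins the row with tabs, mapping null fields to \N instead of the
    // "null" string a plain mkString/toString would produce.
    public static String toTsv(List<Object> row) {
        return row.stream()
                .map(v -> v == null ? NULL_VALUE : v.toString())
                .collect(Collectors.joining("\t"));
    }
}
```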





