Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2020/06/09 16:38:36 UTC

[GitHub] [beam] sabhyankar opened a new pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

sabhyankar opened a new pull request #11950:
URL: https://github.com/apache/beam/pull/11950


   This PR adds a new PTransform (SplunkIO) that writes events to [Splunk's HTTP Event Collector (HEC)](https://dev.splunk.com/enterprise/docs/dataapps/httpeventcollector/).
   
   R: @pabloem 
   ------------------------
   
   Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:
   
    - [X] [**Choose reviewer(s)**](https://beam.apache.org/contribute/#make-your-change) and mention them in a comment (`R: @username`).
    - [X] Format the pull request title like `[BEAM-XXX] Fixes bug in ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA issue, if applicable. This will automatically link the pull request to the issue.
    - [X] Update `CHANGES.md` with noteworthy changes.
    - [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   See the [Contributor Guide](https://beam.apache.org/contribute) for more tips on [how to make review process smoother](https://beam.apache.org/contribute/#make-reviewers-job-easier).
   
   Post-Commit Tests Status (on master branch)
   ------------------------------------------------------------------------------------------------
   
   Lang | SDK | Apex | Dataflow | Flink | Gearpump | Samza | Spark
   --- | --- | --- | --- | --- | --- | --- | ---
   Go | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go_VR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go_VR_Flink/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go_VR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go_VR_Spark/lastCompletedBuild/)
   Java | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow_Java11/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow_Java11/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink_Java11/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink_Java11/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Batch/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Batch/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Streaming/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Flink_Streaming/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming/lastCompletedBuild/)
   Python | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python2/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/) | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/)
   XLang | --- | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_XVR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_XVR_Flink/lastCompletedBuild/) | --- | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_XVR_Spark/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_XVR_Spark/lastCompletedBuild/)
   
   Pre-Commit Tests Status (on master branch)
   ------------------------------------------------------------------------------------------------
   
   --- |Java | Python | Go | Website
   --- | --- | --- | --- | ---
   Non-portable | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Java_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Java_Cron/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Python_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Python_Cron/lastCompletedBuild/)<br>[![Build Status](https://builds.apache.org/job/beam_PreCommit_PythonLint_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_PythonLint_Cron/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Go_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Go_Cron/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Website_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Website_Cron/lastCompletedBuild/) 
   Portable | --- | [![Build Status](https://builds.apache.org/job/beam_PreCommit_Portable_Python_Cron/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PreCommit_Portable_Python_Cron/lastCompletedBuild/) | --- | ---
   
   See [.test-infra/jenkins/README](https://github.com/apache/beam/blob/master/.test-infra/jenkins/README.md) for trigger phrase, status and link of all Jenkins jobs.
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] sabhyankar commented on a change in pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
sabhyankar commented on a change in pull request #11950:
URL: https://github.com/apache/beam/pull/11950#discussion_r437640982



##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkIO.java
##########
@@ -0,0 +1,359 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.auto.value.AutoValue;
+import java.util.concurrent.ThreadLocalRandom;
+import javax.annotation.Nullable;
+import org.apache.beam.sdk.annotations.Experimental;
+import org.apache.beam.sdk.annotations.Experimental.Kind;
+import org.apache.beam.sdk.coders.BigEndianIntegerCoder;
+import org.apache.beam.sdk.coders.KvCoder;
+import org.apache.beam.sdk.options.ValueProvider;
+import org.apache.beam.sdk.transforms.DoFn;
+import org.apache.beam.sdk.transforms.PTransform;
+import org.apache.beam.sdk.transforms.ParDo;
+import org.apache.beam.sdk.values.KV;
+import org.apache.beam.sdk.values.PCollection;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * An unbounded sink for Splunk's Http Event Collector (HEC).
+ *
+ * <p>For more information, see the online documentation at <a
+ * href="https://dev.splunk.com/enterprise/docs/dataapps/httpeventcollector/">Splunk HEC</a>.
+ *
+ * <h3>Writing to Splunk's HEC</h3>
+ *
+ * <p>The {@link SplunkIO} class provides a {@link PTransform} that allows writing {@link
+ * SplunkEvent} messages into a Splunk HEC end point.
+ *
+ * <p>It takes as an input a {@link PCollection PCollection&lt;SplunkEvent&gt;}, where each {@link
+ * SplunkEvent} represents an event to be published to HEC.
+ *
+ * <p>To configure a {@link SplunkIO}, you must provide at a minimum:
+ *
+ * <ul>
+ *   <li>url - HEC endpoint URL.
+ *   <li>token - HEC endpoint token.
+ * </ul>
+ *
+ * <p>The {@link SplunkIO} transform can be customized further by optionally specifying:
+ *
+ * <ul>
+ *   <li>parallelism - Number of parallel requests to the HEC.
+ *   <li>batchCount - Number of events in a single batch.
+ *   <li>disableCertificateValidation - Whether to disable ssl validation (useful for self-signed
+ *       certificates)
+ * </ul>
+ *
+ * <p>This transform will return any non-transient write failures via a {@link PCollection
+ * PCollection&lt;SplunkWriteError&gt;}, where each {@link SplunkWriteError} captures the error that
+ * occurred while attempting to write to HEC. These can be published to a dead-letter sink or
+ * reprocessed.
+ *
+ * <p>For example:
+ *
+ * <pre>{@code
+ * PCollection<SplunkEvent> events = ...;
+ *
+ * PCollection<SplunkWriteError> errors =
+ *         events.apply("WriteToSplunk",
+ *              SplunkIO.writeBuilder()

Review comment:
       Thanks for that input! Switched to a fluent factory pattern for the SplunkIO transform.
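
    For reference, a minimal sketch of the new fluent usage (mirroring the updated Javadoc in this PR; `url`, `token`, `batchCount`, and `parallelism` are placeholder variables):

    ```java
    // Sketch only: fluent SplunkIO factory usage; variable names are illustrative.
    PCollection<SplunkEvent> events = ...;

    PCollection<SplunkWriteError> errors =
        events.apply("WriteToSplunk",
            SplunkIO.write(url, token)
                .withBatchCount(batchCount)
                .withParallelism(parallelism)
                .withDisableCertificateValidation(true));
    ```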




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
pabloem commented on pull request #11950:
URL: https://github.com/apache/beam/pull/11950#issuecomment-642858867


   retest this please


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
pabloem commented on pull request #11950:
URL: https://github.com/apache/beam/pull/11950#issuecomment-642958601


   thanks @sabhyankar ! this is a great addition. Could you please add it to this catalog: https://github.com/apache/beam/blob/master/website/www/site/data/io_matrix.yaml ?


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] sabhyankar commented on a change in pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
sabhyankar commented on a change in pull request #11950:
URL: https://github.com/apache/beam/pull/11950#discussion_r437606598



##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkEvent.java
##########
@@ -0,0 +1,159 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.auto.value.AutoValue;
+import com.google.gson.annotations.SerializedName;
+import javax.annotation.Nullable;
+
+/**
+ * A {@link SplunkEvent} describes a single payload sent to Splunk's Http Event Collector (HEC)
+ * endpoint.
+ *
+ * <p>Each object represents a single event and related metadata elements such as:
+ *
+ * <ul>
+ *   <li>time
+ *   <li>host
+ *   <li>source
+ *   <li>sourceType
+ *   <li>index
+ * </ul>
+ */
+@AutoValue
+public abstract class SplunkEvent {

Review comment:
       Thanks for the pointer @pabloem 
   
   I have switched from using custom coders to DefaultSchema with AutoValueSchema.
   I noticed in [AutoValueUtils](https://github.com/apache/beam/blob/c3a2dd89616faea5a2171ae6d8e39a77f6e39422/sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/utils/AutoValueUtils.java#L187-L191) that the build method must literally be named 'build', which required me to rename some of my methods.
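
    For anyone following along, a rough sketch of the resulting shape (field names abbreviated; the real class in this PR has more fields):

    ```java
    import com.google.auto.value.AutoValue;
    import javax.annotation.Nullable;
    import org.apache.beam.sdk.schemas.AutoValueSchema;
    import org.apache.beam.sdk.schemas.annotations.DefaultSchema;

    // Beam infers a schema-backed coder from the AutoValue class, so no custom
    // coder is needed.
    @DefaultSchema(AutoValueSchema.class)
    @AutoValue
    public abstract class SplunkEvent {

      @Nullable
      public abstract String host();

      @Nullable
      public abstract String source();

      public static Builder newBuilder() {
        return new AutoValue_SplunkEvent.Builder();
      }

      @AutoValue.Builder
      public abstract static class Builder {
        public abstract Builder setHost(String host);

        public abstract Builder setSource(String source);

        // AutoValueUtils resolves the terminal builder method by the literal
        // name 'build', hence the renames mentioned above.
        public abstract SplunkEvent build();
      }
    }
    ```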




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
pabloem commented on pull request #11950:
URL: https://github.com/apache/beam/pull/11950#issuecomment-642929525


   retest this please


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on a change in pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
pabloem commented on a change in pull request #11950:
URL: https://github.com/apache/beam/pull/11950#discussion_r438284843



##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkEventWriter.java
##########
@@ -0,0 +1,395 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.api.client.http.HttpResponse;
+import com.google.api.client.http.HttpResponseException;
+import com.google.auto.value.AutoValue;
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+import java.io.IOException;
+import java.io.UnsupportedEncodingException;
+import java.security.KeyManagementException;
+import java.security.KeyStoreException;
+import java.security.NoSuchAlgorithmException;
+import java.util.List;
+import javax.annotation.Nullable;
+import org.apache.beam.sdk.metrics.Counter;
+import org.apache.beam.sdk.metrics.Metrics;
+import org.apache.beam.sdk.options.ValueProvider;
+import org.apache.beam.sdk.state.BagState;
+import org.apache.beam.sdk.state.StateSpec;
+import org.apache.beam.sdk.state.StateSpecs;
+import org.apache.beam.sdk.state.TimeDomain;
+import org.apache.beam.sdk.state.Timer;
+import org.apache.beam.sdk.state.TimerSpec;
+import org.apache.beam.sdk.state.TimerSpecs;
+import org.apache.beam.sdk.state.ValueState;
+import org.apache.beam.sdk.transforms.DoFn;
+import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
+import org.apache.beam.sdk.values.KV;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Lists;
+import org.joda.time.Duration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/** A {@link DoFn} to write {@link SplunkEvent}s to Splunk's HEC endpoint. */
+@AutoValue
+abstract class SplunkEventWriter extends DoFn<KV<Integer, SplunkEvent>, SplunkWriteError> {
+
+  private static final Integer DEFAULT_BATCH_COUNT = 1;
+  private static final Boolean DEFAULT_DISABLE_CERTIFICATE_VALIDATION = false;
+  private static final Logger LOG = LoggerFactory.getLogger(SplunkEventWriter.class);
+  private static final long DEFAULT_FLUSH_DELAY = 2;
+  private static final Counter INPUT_COUNTER =
+      Metrics.counter(SplunkEventWriter.class, "inbound-events");
+  private static final Counter SUCCESS_WRITES =
+      Metrics.counter(SplunkEventWriter.class, "outbound-successful-events");
+  private static final Counter FAILED_WRITES =
+      Metrics.counter(SplunkEventWriter.class, "outbound-failed-events");
+  private static final String BUFFER_STATE_NAME = "buffer";
+  private static final String COUNT_STATE_NAME = "count";
+  private static final String TIME_ID_NAME = "expiry";
+
+  @StateId(BUFFER_STATE_NAME)
+  private final StateSpec<BagState<SplunkEvent>> buffer = StateSpecs.bag();
+
+  @StateId(COUNT_STATE_NAME)
+  private final StateSpec<ValueState<Long>> count = StateSpecs.value();
+
+  @TimerId(TIME_ID_NAME)
+  private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.EVENT_TIME);

Review comment:
       I wonder if this should be processing time, to avoid getting stuck when the watermark slows down?
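
    If it does move to processing time, the change would just be the `TimeDomain` on the spec, e.g. (a sketch keeping the PR's timer id; the `offset(...).setRelative()` call in `processElement` would then be relative to processing time):

    ```java
    @TimerId(TIME_ID_NAME)
    private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.PROCESSING_TIME);
    ```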

##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkEventWriter.java
##########
@@ -0,0 +1,395 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.api.client.http.HttpResponse;
+import com.google.api.client.http.HttpResponseException;
+import com.google.auto.value.AutoValue;
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+import java.io.IOException;
+import java.io.UnsupportedEncodingException;
+import java.security.KeyManagementException;
+import java.security.KeyStoreException;
+import java.security.NoSuchAlgorithmException;
+import java.util.List;
+import javax.annotation.Nullable;
+import org.apache.beam.sdk.metrics.Counter;
+import org.apache.beam.sdk.metrics.Metrics;
+import org.apache.beam.sdk.options.ValueProvider;
+import org.apache.beam.sdk.state.BagState;
+import org.apache.beam.sdk.state.StateSpec;
+import org.apache.beam.sdk.state.StateSpecs;
+import org.apache.beam.sdk.state.TimeDomain;
+import org.apache.beam.sdk.state.Timer;
+import org.apache.beam.sdk.state.TimerSpec;
+import org.apache.beam.sdk.state.TimerSpecs;
+import org.apache.beam.sdk.state.ValueState;
+import org.apache.beam.sdk.transforms.DoFn;
+import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
+import org.apache.beam.sdk.values.KV;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Lists;
+import org.joda.time.Duration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/** A {@link DoFn} to write {@link SplunkEvent}s to Splunk's HEC endpoint. */
+@AutoValue
+abstract class SplunkEventWriter extends DoFn<KV<Integer, SplunkEvent>, SplunkWriteError> {
+
+  private static final Integer DEFAULT_BATCH_COUNT = 1;
+  private static final Boolean DEFAULT_DISABLE_CERTIFICATE_VALIDATION = false;
+  private static final Logger LOG = LoggerFactory.getLogger(SplunkEventWriter.class);
+  private static final long DEFAULT_FLUSH_DELAY = 2;
+  private static final Counter INPUT_COUNTER =
+      Metrics.counter(SplunkEventWriter.class, "inbound-events");
+  private static final Counter SUCCESS_WRITES =
+      Metrics.counter(SplunkEventWriter.class, "outbound-successful-events");
+  private static final Counter FAILED_WRITES =
+      Metrics.counter(SplunkEventWriter.class, "outbound-failed-events");
+  private static final String BUFFER_STATE_NAME = "buffer";
+  private static final String COUNT_STATE_NAME = "count";
+  private static final String TIME_ID_NAME = "expiry";
+
+  @StateId(BUFFER_STATE_NAME)
+  private final StateSpec<BagState<SplunkEvent>> buffer = StateSpecs.bag();
+
+  @StateId(COUNT_STATE_NAME)
+  private final StateSpec<ValueState<Long>> count = StateSpecs.value();
+
+  @TimerId(TIME_ID_NAME)
+  private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.EVENT_TIME);
+
+  private Integer batchCount;
+  private Boolean disableValidation;
+  private HttpEventPublisher publisher;
+
+  private static final Gson GSON =
+      new GsonBuilder().setFieldNamingStrategy(f -> f.getName().toLowerCase()).create();
+
+  /** A builder class for creating a {@link SplunkEventWriter}. */
+  static Builder newBuilder() {
+    return new AutoValue_SplunkEventWriter.Builder();
+  }
+
+  @Nullable
+  abstract ValueProvider<String> url();
+
+  @Nullable
+  abstract ValueProvider<String> token();
+
+  @Nullable
+  abstract ValueProvider<Boolean> disableCertificateValidation();
+
+  @Nullable
+  abstract ValueProvider<Integer> inputBatchCount();
+
+  @Setup
+  public void setup() {
+
+    checkArgument(url().isAccessible(), "url is required for writing events.");
+    checkArgument(token().isAccessible(), "Access token is required for writing events.");
+
+    // Either user supplied or default batchCount.
+    if (batchCount == null) {
+
+      if (inputBatchCount() != null) {
+        batchCount = inputBatchCount().get();
+      }
+
+      batchCount = MoreObjects.firstNonNull(batchCount, DEFAULT_BATCH_COUNT);
+      LOG.info("Batch count set to: {}", batchCount);
+    }
+
+    // Either user supplied or default disableValidation.
+    if (disableValidation == null) {
+
+      if (disableCertificateValidation() != null) {
+        disableValidation = disableCertificateValidation().get();
+      }
+
+      disableValidation =
+          MoreObjects.firstNonNull(disableValidation, DEFAULT_DISABLE_CERTIFICATE_VALIDATION);
+      LOG.info("Disable certificate validation set to: {}", disableValidation);
+    }
+
+    try {
+      HttpEventPublisher.Builder builder =
+          HttpEventPublisher.newBuilder()
+              .withUrl(url().get())
+              .withToken(token().get())
+              .withDisableCertificateValidation(disableValidation);
+
+      publisher = builder.build();
+      LOG.info("Successfully created HttpEventPublisher");
+
+    } catch (NoSuchAlgorithmException
+        | KeyStoreException
+        | KeyManagementException
+        | UnsupportedEncodingException e) {
+      LOG.error("Error creating HttpEventPublisher: {}", e.getMessage());
+      throw new RuntimeException(e);
+    }
+  }
+
+  @ProcessElement
+  public void processElement(
+      @Element KV<Integer, SplunkEvent> input,
+      OutputReceiver<SplunkWriteError> receiver,
+      BoundedWindow window,
+      @StateId(BUFFER_STATE_NAME) BagState<SplunkEvent> bufferState,
+      @StateId(COUNT_STATE_NAME) ValueState<Long> countState,
+      @TimerId(TIME_ID_NAME) Timer timer)
+      throws IOException {
+
+    Long count = MoreObjects.<Long>firstNonNull(countState.read(), 0L);
+    SplunkEvent event = input.getValue();
+    INPUT_COUNTER.inc();
+    bufferState.add(event);
+    count += 1;
+    countState.write(count);
+    timer.offset(Duration.standardSeconds(DEFAULT_FLUSH_DELAY)).setRelative();
+
+    if (count >= batchCount) {
+
+      LOG.info("Flushing batch of {} events", count);
+      flush(receiver, bufferState, countState);
+    }
+  }
+
+  @OnTimer(TIME_ID_NAME)
+  public void onExpiry(
+      OutputReceiver<SplunkWriteError> receiver,
+      @StateId(BUFFER_STATE_NAME) BagState<SplunkEvent> bufferState,
+      @StateId(COUNT_STATE_NAME) ValueState<Long> countState)
+      throws IOException {
+
+    if (MoreObjects.<Long>firstNonNull(countState.read(), 0L) > 0) {
+      LOG.info("Flushing window with {} events", countState.read());
+      flush(receiver, bufferState, countState);
+    }
+  }
+
+  @Teardown
+  public void tearDown() {
+    if (this.publisher != null) {
+      try {
+        this.publisher.close();
+        LOG.info("Successfully closed HttpEventPublisher");
+
+      } catch (IOException e) {
+        LOG.warn("Received exception while closing HttpEventPublisher: {}", e.getMessage());
+      }
+    }
+  }
+
+  /**
+   * Flushes a batch of requests via {@link HttpEventPublisher}.
+   *
+   * @param receiver Receiver to write {@link SplunkWriteError}s to
+   */
+  private void flush(
+      OutputReceiver<SplunkWriteError> receiver,
+      @StateId(BUFFER_STATE_NAME) BagState<SplunkEvent> bufferState,
+      @StateId(COUNT_STATE_NAME) ValueState<Long> countState)

Review comment:
       I guess in this case you don't need the `StateId` annotations, since the state handles are passed in when you call this function and are not filled in by Beam.
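
    i.e. something like the following sketch (same body as in the PR, just plain parameters):

    ```java
    private void flush(
        OutputReceiver<SplunkWriteError> receiver,
        BagState<SplunkEvent> bufferState,
        ValueState<Long> countState)
        throws IOException {
      // ... unchanged body: publish the buffered events, emit any
      // SplunkWriteError, then clear bufferState and countState.
    }
    ```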

##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkIO.java
##########
@@ -77,32 +78,48 @@
  *
  * PCollection<SplunkWriteError> errors =
  *         events.apply("WriteToSplunk",
- *              SplunkIO.writeBuilder()
- *                     .withToken(token)
- *                     .withUrl(url)
- *                     .withBatchCount(batchCount)
- *                     .withParallelism(parallelism)
- *                     .withDisableCertificateValidation(true)
- *                     .build());
+ *              SplunkIO.write(url, token)
+ *                  .withBatchCount(batchCount)
+ *                  .withParallelism(parallelism)
+ *                  .withDisableCertificateValidation(true));
  * }</pre>
  */
 @Experimental(Kind.SOURCE_SINK)
 public class SplunkIO {
 
+  /**
+   * Write to Splunk's Http Event Collector (HEC).
+   *
+   * @param url splunk hec url
+   * @param token splunk hec authentication token

Review comment:
       Is it possible to not need an auth token? If so, should the token be added in a factory method instead? Or perhaps with a second constructor?
   
   Up to you. You know this better.

##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/HttpEventPublisher.java
##########
@@ -0,0 +1,346 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.api.client.http.ByteArrayContent;
+import com.google.api.client.http.GenericUrl;
+import com.google.api.client.http.HttpBackOffUnsuccessfulResponseHandler;
+import com.google.api.client.http.HttpBackOffUnsuccessfulResponseHandler.BackOffRequired;
+import com.google.api.client.http.HttpContent;
+import com.google.api.client.http.HttpMediaType;
+import com.google.api.client.http.HttpRequest;
+import com.google.api.client.http.HttpRequestFactory;
+import com.google.api.client.http.HttpResponse;
+import com.google.api.client.http.apache.v2.ApacheHttpTransport;
+import com.google.api.client.util.ExponentialBackOff;
+import com.google.auto.value.AutoValue;
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+import java.io.IOException;
+import java.io.UnsupportedEncodingException;
+import java.security.KeyManagementException;
+import java.security.KeyStoreException;
+import java.security.NoSuchAlgorithmException;
+import java.util.List;
+import javax.annotation.Nullable;
+import javax.net.ssl.HostnameVerifier;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.annotations.VisibleForTesting;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Joiner;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
+import org.apache.http.client.config.CookieSpecs;
+import org.apache.http.client.config.RequestConfig;
+import org.apache.http.conn.ssl.DefaultHostnameVerifier;
+import org.apache.http.conn.ssl.NoopHostnameVerifier;
+import org.apache.http.conn.ssl.SSLConnectionSocketFactory;
+import org.apache.http.conn.ssl.TrustStrategy;
+import org.apache.http.impl.client.CloseableHttpClient;
+import org.apache.http.impl.client.HttpClientBuilder;
+import org.apache.http.ssl.SSLContextBuilder;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * A utility class that helps write {@link SplunkEvent} records to Splunk's Http Event Collector
+ * (HEC) endpoint.
+ */
+@AutoValue
+abstract class HttpEventPublisher {
+
+  private static final Logger LOG = LoggerFactory.getLogger(HttpEventPublisher.class);
+
+  private static final int DEFAULT_MAX_CONNECTIONS = 1;
+
+  private static final boolean DEFAULT_DISABLE_CERTIFICATE_VALIDATION = false;
+
+  private static final Gson GSON =
+      new GsonBuilder().setFieldNamingStrategy(f -> f.getName().toLowerCase()).create();
+
+  @VisibleForTesting static final String HEC_URL_PATH = "services/collector/event";
+
+  private static final HttpMediaType MEDIA_TYPE =
+      new HttpMediaType("application/json;profile=urn:splunk:event:1.0;charset=utf-8");
+
+  private static final String CONTENT_TYPE =
+      Joiner.on('/').join(MEDIA_TYPE.getType(), MEDIA_TYPE.getSubType());
+
+  private static final String AUTHORIZATION_SCHEME = "Splunk %s";
+
+  private static final String HTTPS_PROTOCOL_PREFIX = "https";
+
+  /** Provides a builder for creating a {@link HttpEventPublisher}. */
+  static Builder newBuilder() {
+    return new AutoValue_HttpEventPublisher.Builder();
+  }
+
+  abstract ApacheHttpTransport transport();
+
+  abstract HttpRequestFactory requestFactory();
+
+  abstract GenericUrl genericUrl();
+
+  abstract String token();
+
+  @Nullable
+  abstract Integer maxElapsedMillis();
+
+  abstract Boolean disableCertificateValidation();
+
+  /**
+   * Executes a POST for the list of {@link SplunkEvent} objects into Splunk's Http Event Collector
+   * endpoint.
+   *
+   * @param events list of {@link SplunkEvent}s
+   * @return {@link HttpResponse} for the POST
+   */
+  HttpResponse execute(List<SplunkEvent> events) throws IOException {
+
+    HttpContent content = getContent(events);
+    HttpRequest request = requestFactory().buildPostRequest(genericUrl(), content);
+
+    HttpBackOffUnsuccessfulResponseHandler responseHandler =
+        new HttpBackOffUnsuccessfulResponseHandler(getConfiguredBackOff());
+
+    responseHandler.setBackOffRequired(BackOffRequired.ON_SERVER_ERROR);
+
+    request.setUnsuccessfulResponseHandler(responseHandler);
+    setHeaders(request, token());
+
+    return request.execute();
+  }
+
+  /**
+   * Same as {@link HttpEventPublisher#execute(List)} but with a single {@link SplunkEvent}.
+   *
+   * @param event {@link SplunkEvent} object
+   */
+  HttpResponse execute(SplunkEvent event) throws IOException {
+    return this.execute(ImmutableList.of(event));
+  }
+
+  /**
+   * Returns an {@link ExponentialBackOff} with the right settings.
+   *
+   * @return {@link ExponentialBackOff} object
+   */
+  @VisibleForTesting
+  ExponentialBackOff getConfiguredBackOff() {
+    return new ExponentialBackOff.Builder().setMaxElapsedTimeMillis(maxElapsedMillis()).build();
+  }
+
+  /** Shuts down the connection manager and releases all resources. */
+  void close() throws IOException {
+    if (transport() != null) {
+      LOG.info("Closing publisher transport.");
+      transport().shutdown();
+    }
+  }
+
+  /**
+   * Utility method to set Authorization and other relevant http headers into the {@link
+   * HttpRequest}.
+   *
+   * @param request {@link HttpRequest} object to add headers to
+   * @param token Splunk's HEC authorization token
+   */
+  private void setHeaders(HttpRequest request, String token) {
+    request.getHeaders().setAuthorization(String.format(AUTHORIZATION_SCHEME, token));
+  }
+
+  /**
+   * Marshals a list of {@link SplunkEvent}s into an {@link HttpContent} object that can be used to
+   * create an {@link HttpRequest}.
+   *
+   * @param events list of {@link SplunkEvent}s
+   * @return {@link HttpContent} that can be used to create an {@link HttpRequest}.
+   */
+  @VisibleForTesting
+  HttpContent getContent(List<SplunkEvent> events) {
+    String payload = getStringPayload(events);
+    LOG.debug("Payload content: {}", payload);
+    return ByteArrayContent.fromString(CONTENT_TYPE, payload);
+  }
+
+  /** Extracts the payload string from a list of {@link SplunkEvent}s. */
+  @VisibleForTesting
+  String getStringPayload(List<SplunkEvent> events) {
+    StringBuilder sb = new StringBuilder();
+    events.forEach(event -> sb.append(GSON.toJson(event)));

Review comment:
       Does this add newline characters or some other way to separate JSON objects? Is this not necessary?

##########
File path: sdks/java/io/splunk/build.gradle
##########
@@ -0,0 +1,39 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * License); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an AS IS BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+plugins {
+    id 'org.apache.beam.module'
+}
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.splunk')
+
+description = "Apache Beam :: SDKs :: Java :: IO :: Splunk"
+ext.summary = "IO to write events to Splunk Http Event Collector (HEC)"
+
+dependencies {
+    compile library.java.slf4j_api
+    compile project(path: ":sdks:java:core", configuration: "shadow")
+    compile group: "com.google.code.gson", name: "gson", version: "2.8.6"
+    compile group: "com.google.api-client", name: "google-api-client", version: "1.30.9"
+    compile group: "com.google.http-client", name: "google-http-client-apache-v2", version: "1.31.0"

Review comment:
       Note google-api-client is part of Beam's standard deps
   https://github.com/apache/beam/blob/master/buildSrc/src/main/groovy/org/apache/beam/gradle/BeamModulePlugin.groovy#L461
   
   Can you add all of the compile dependencies to `BeamModulePlugin.groovy` and get them from there?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] sabhyankar commented on a change in pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
sabhyankar commented on a change in pull request #11950:
URL: https://github.com/apache/beam/pull/11950#discussion_r438777187



##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/HttpEventPublisher.java
##########
@@ -0,0 +1,346 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.api.client.http.ByteArrayContent;
+import com.google.api.client.http.GenericUrl;
+import com.google.api.client.http.HttpBackOffUnsuccessfulResponseHandler;
+import com.google.api.client.http.HttpBackOffUnsuccessfulResponseHandler.BackOffRequired;
+import com.google.api.client.http.HttpContent;
+import com.google.api.client.http.HttpMediaType;
+import com.google.api.client.http.HttpRequest;
+import com.google.api.client.http.HttpRequestFactory;
+import com.google.api.client.http.HttpResponse;
+import com.google.api.client.http.apache.v2.ApacheHttpTransport;
+import com.google.api.client.util.ExponentialBackOff;
+import com.google.auto.value.AutoValue;
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+import java.io.IOException;
+import java.io.UnsupportedEncodingException;
+import java.security.KeyManagementException;
+import java.security.KeyStoreException;
+import java.security.NoSuchAlgorithmException;
+import java.util.List;
+import javax.annotation.Nullable;
+import javax.net.ssl.HostnameVerifier;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.annotations.VisibleForTesting;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Joiner;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
+import org.apache.http.client.config.CookieSpecs;
+import org.apache.http.client.config.RequestConfig;
+import org.apache.http.conn.ssl.DefaultHostnameVerifier;
+import org.apache.http.conn.ssl.NoopHostnameVerifier;
+import org.apache.http.conn.ssl.SSLConnectionSocketFactory;
+import org.apache.http.conn.ssl.TrustStrategy;
+import org.apache.http.impl.client.CloseableHttpClient;
+import org.apache.http.impl.client.HttpClientBuilder;
+import org.apache.http.ssl.SSLContextBuilder;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * A utility class that helps write {@link SplunkEvent} records to Splunk's Http Event Collector
+ * (HEC) endpoint.
+ */
+@AutoValue
+abstract class HttpEventPublisher {
+
+  private static final Logger LOG = LoggerFactory.getLogger(HttpEventPublisher.class);
+
+  private static final int DEFAULT_MAX_CONNECTIONS = 1;
+
+  private static final boolean DEFAULT_DISABLE_CERTIFICATE_VALIDATION = false;
+
+  private static final Gson GSON =
+      new GsonBuilder().setFieldNamingStrategy(f -> f.getName().toLowerCase()).create();
+
+  @VisibleForTesting static final String HEC_URL_PATH = "services/collector/event";
+
+  private static final HttpMediaType MEDIA_TYPE =
+      new HttpMediaType("application/json;profile=urn:splunk:event:1.0;charset=utf-8");
+
+  private static final String CONTENT_TYPE =
+      Joiner.on('/').join(MEDIA_TYPE.getType(), MEDIA_TYPE.getSubType());
+
+  private static final String AUTHORIZATION_SCHEME = "Splunk %s";
+
+  private static final String HTTPS_PROTOCOL_PREFIX = "https";
+
+  /** Provides a builder for creating a {@link HttpEventPublisher}. */
+  static Builder newBuilder() {
+    return new AutoValue_HttpEventPublisher.Builder();
+  }
+
+  abstract ApacheHttpTransport transport();
+
+  abstract HttpRequestFactory requestFactory();
+
+  abstract GenericUrl genericUrl();
+
+  abstract String token();
+
+  @Nullable
+  abstract Integer maxElapsedMillis();
+
+  abstract Boolean disableCertificateValidation();
+
+  /**
+   * Executes a POST for the list of {@link SplunkEvent} objects into Splunk's Http Event Collector
+   * endpoint.
+   *
+   * @param events list of {@link SplunkEvent}s
+   * @return {@link HttpResponse} for the POST
+   */
+  HttpResponse execute(List<SplunkEvent> events) throws IOException {
+
+    HttpContent content = getContent(events);
+    HttpRequest request = requestFactory().buildPostRequest(genericUrl(), content);
+
+    HttpBackOffUnsuccessfulResponseHandler responseHandler =
+        new HttpBackOffUnsuccessfulResponseHandler(getConfiguredBackOff());
+
+    responseHandler.setBackOffRequired(BackOffRequired.ON_SERVER_ERROR);
+
+    request.setUnsuccessfulResponseHandler(responseHandler);
+    setHeaders(request, token());
+
+    return request.execute();
+  }
+
+  /**
+   * Same as {@link HttpEventPublisher#execute(List)} but with a single {@link SplunkEvent}.
+   *
+   * @param event {@link SplunkEvent} object
+   */
+  HttpResponse execute(SplunkEvent event) throws IOException {
+    return this.execute(ImmutableList.of(event));
+  }
+
+  /**
+   * Returns an {@link ExponentialBackOff} with the right settings.
+   *
+   * @return {@link ExponentialBackOff} object
+   */
+  @VisibleForTesting
+  ExponentialBackOff getConfiguredBackOff() {
+    return new ExponentialBackOff.Builder().setMaxElapsedTimeMillis(maxElapsedMillis()).build();
+  }
+
+  /** Shuts down the connection manager and releases all resources. */
+  void close() throws IOException {
+    if (transport() != null) {
+      LOG.info("Closing publisher transport.");
+      transport().shutdown();
+    }
+  }
+
+  /**
+   * Utility method to set Authorization and other relevant http headers into the {@link
+   * HttpRequest}.
+   *
+   * @param request {@link HttpRequest} object to add headers to
+   * @param token Splunk's HEC authorization token
+   */
+  private void setHeaders(HttpRequest request, String token) {
+    request.getHeaders().setAuthorization(String.format(AUTHORIZATION_SCHEME, token));
+  }
+
+  /**
+   * Marshals a list of {@link SplunkEvent}s into an {@link HttpContent} object that can be used to
+   * create an {@link HttpRequest}.
+   *
+   * @param events list of {@link SplunkEvent}s
+   * @return {@link HttpContent} that can be used to create an {@link HttpRequest}.
+   */
+  @VisibleForTesting
+  HttpContent getContent(List<SplunkEvent> events) {
+    String payload = getStringPayload(events);
+    LOG.debug("Payload content: {}", payload);
+    return ByteArrayContent.fromString(CONTENT_TYPE, payload);
+  }
+
+  /** Extracts the payload string from a list of {@link SplunkEvent}s. */
+  @VisibleForTesting
+  String getStringPayload(List<SplunkEvent> events) {
+    StringBuilder sb = new StringBuilder();
+    events.forEach(event -> sb.append(GSON.toJson(event)));

Review comment:
       It's not necessary to add newlines or other separators: the batch protocol for Splunk's HEC accepts [simple event objects stacked one after another, not necessarily in a JSON array](https://docs.splunk.com/Documentation/Splunk/8.0.4/Data/FormateventsforHTTPEventCollector#Examples).
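
    To make the wire format concrete, an illustration (a sketch; `firstEvent` and `secondEvent` are hypothetical SplunkEvent instances, and `GSON` is the publisher's configured Gson instance):

    ```java
    // getStringPayload simply appends each serialized event; Splunk HEC parses
    // the stacked JSON objects as a batch, so no delimiter is required.
    String payload = GSON.toJson(firstEvent) + GSON.toJson(secondEvent);
    // e.g. -> {"event":"e1","host":"h1"}{"event":"e2","host":"h2"}
    ```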




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] sabhyankar commented on a change in pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
sabhyankar commented on a change in pull request #11950:
URL: https://github.com/apache/beam/pull/11950#discussion_r438829615



##########
File path: sdks/java/io/splunk/build.gradle
##########
@@ -0,0 +1,39 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * License); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an AS IS BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+plugins {
+    id 'org.apache.beam.module'
+}
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.splunk')
+
+description = "Apache Beam :: SDKs :: Java :: IO :: Splunk"
+ext.summary = "IO to write events to Splunk Http Event Collector (HEC)"
+
+dependencies {
+    compile library.java.slf4j_api
+    compile project(path: ":sdks:java:core", configuration: "shadow")
+    compile group: "com.google.code.gson", name: "gson", version: "2.8.6"
+    compile group: "com.google.api-client", name: "google-api-client", version: "1.30.9"
+    compile group: "com.google.http-client", name: "google-http-client-apache-v2", version: "1.31.0"

Review comment:
       Done




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on a change in pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
pabloem commented on a change in pull request #11950:
URL: https://github.com/apache/beam/pull/11950#discussion_r436992485



##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkEvent.java
##########
@@ -0,0 +1,159 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.auto.value.AutoValue;
+import com.google.gson.annotations.SerializedName;
+import javax.annotation.Nullable;
+
+/**
+ * A {@link SplunkEvent} describes a single payload sent to Splunk's Http Event Collector (HEC)
+ * endpoint.
+ *
+ * <p>Each object represents a single event and related metadata elements such as:
+ *
+ * <ul>
+ *   <li>time
+ *   <li>host
+ *   <li>source
+ *   <li>sourceType
+ *   <li>index
+ * </ul>
+ */
+@AutoValue
+public abstract class SplunkEvent {

Review comment:
       I recommend you use `@DefaultSchema(AutoValueSchema.class)` for this class instead of writing a custom coder.

##########
File path: sdks/java/io/splunk/build.gradle
##########
@@ -0,0 +1,39 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * License); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an AS IS BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+plugins {
+    id 'org.apache.beam.module'
+}
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.splunk')
+
+description = "Apache Beam :: SDKs :: Java :: IO :: Splunk"
+ext.summary = "IO to write events to Splunk Http Event Collector (HEC)"
+
+dependencies {
+    compile library.java.slf4j_api
+    compile project(path: ":sdks:java:core", configuration: "shadow")
+    compile group: "com.google.code.gson", name: "gson", version: "2.8.6"
+    compile group: "com.google.api-client", name: "google-api-client", version: "1.30.9"
+    compile group: "com.google.http-client", name: "google-http-client-apache-v2", version: "1.31.0"

Review comment:
       TODO(pablo/sameer) - perhaps add these to the project dependencies. Ensure they don't give extra trouble.

##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkIO.java
##########
@@ -0,0 +1,359 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.auto.value.AutoValue;
+import java.util.concurrent.ThreadLocalRandom;
+import javax.annotation.Nullable;
+import org.apache.beam.sdk.annotations.Experimental;
+import org.apache.beam.sdk.annotations.Experimental.Kind;
+import org.apache.beam.sdk.coders.BigEndianIntegerCoder;
+import org.apache.beam.sdk.coders.KvCoder;
+import org.apache.beam.sdk.options.ValueProvider;
+import org.apache.beam.sdk.transforms.DoFn;
+import org.apache.beam.sdk.transforms.PTransform;
+import org.apache.beam.sdk.transforms.ParDo;
+import org.apache.beam.sdk.values.KV;
+import org.apache.beam.sdk.values.PCollection;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * An unbounded sink for Splunk's Http Event Collector (HEC).
+ *
+ * <p>For more information, see the online documentation at <a
+ * href="https://dev.splunk.com/enterprise/docs/dataapps/httpeventcollector/">Splunk HEC</a>.
+ *
+ * <h3>Writing to Splunk's HEC</h3>
+ *
+ * <p>The {@link SplunkIO} class provides a {@link PTransform} that allows writing {@link
+ * SplunkEvent} messages into a Splunk HEC end point.
+ *
+ * <p>It takes as an input a {@link PCollection PCollection&lt;SplunkEvent&gt;}, where each {@link
+ * SplunkEvent} represents an event to be published to HEC.
+ *
+ * <p>To configure a {@link SplunkIO}, you must provide at a minimum:
+ *
+ * <ul>
+ *   <li>url - HEC endpoint URL.
+ *   <li>token - HEC endpoint token.
+ * </ul>
+ *
+ * <p>The {@link SplunkIO} transform can be customized further by optionally specifying:
+ *
+ * <ul>
+ *   <li>parallelism - Number of parallel requests to the HEC.
+ *   <li>batchCount - Number of events in a single batch.
+ *   <li>disableCertificateValidation - Whether to disable ssl validation (useful for self-signed
+ *       certificates)
+ * </ul>
+ *
+ * <p>This transform will return any non-transient write failures via a {@link PCollection
+ * PCollection&lt;SplunkWriteError&gt;}, where each {@link SplunkWriteError} captures the error that
+ * occurred while attempting to write to HEC. These can be published to a dead-letter sink or
+ * reprocessed.
+ *
+ * <p>For example:
+ *
+ * <pre>{@code
+ * PCollection<SplunkEvent> events = ...;
+ *
+ * PCollection<SplunkWriteError> errors =
+ *         events.apply("WriteToSplunk",
+ *              SplunkIO.writeBuilder()

Review comment:
       In Beam, PTransform builders like this are generally not exposed to users directly. Do you think it makes sense to offer a fluent, factory-style API that doesn't require callers to go through a builder?
   e.g.
   
   ```
    *                 SplunkIO.write()
    *                     .withToken(token)
    *                     .withUrl(url)
    *                     .withBatchCount(batchCount)
    *                     .withParallelism(parallelism)
    *                     .withDisableCertificateValidation(true)
   ```
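
       For context, a rough sketch of how that fluent surface is usually wired up internally (class, method, and field names here are illustrative assumptions rather than this PR's code; imports as already present in SplunkIO.java): the public with* methods delegate to a package-private AutoValue builder, so callers never construct or see the builder themselves.

   ```
   @AutoValue
   public abstract static class Write
       extends PTransform<PCollection<SplunkEvent>, PCollection<SplunkWriteError>> {

     @Nullable
     abstract ValueProvider<String> url();

     @Nullable
     abstract ValueProvider<String> token();

     abstract Builder toBuilder();

     // The initial instance would come from a static SplunkIO.write(...) factory.
     public Write withUrl(ValueProvider<String> url) {
       return toBuilder().setUrl(url).build();
     }

     public Write withToken(ValueProvider<String> token) {
       return toBuilder().setToken(token).build();
     }

     @Override
     public PCollection<SplunkWriteError> expand(PCollection<SplunkEvent> events) {
       // Sketch only: the real transform would key, batch, and write the events.
       throw new UnsupportedOperationException("sketch only");
     }

     @AutoValue.Builder
     abstract static class Builder {

       abstract Builder setUrl(ValueProvider<String> url);

       abstract Builder setToken(ValueProvider<String> token);

       abstract Write build();
     }
   }
   ```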

##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkEventWriter.java
##########
@@ -0,0 +1,395 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.api.client.http.HttpResponse;
+import com.google.api.client.http.HttpResponseException;
+import com.google.auto.value.AutoValue;
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+import java.io.IOException;
+import java.io.UnsupportedEncodingException;
+import java.security.KeyManagementException;
+import java.security.KeyStoreException;
+import java.security.NoSuchAlgorithmException;
+import java.util.List;
+import javax.annotation.Nullable;
+import org.apache.beam.sdk.metrics.Counter;
+import org.apache.beam.sdk.metrics.Metrics;
+import org.apache.beam.sdk.options.ValueProvider;
+import org.apache.beam.sdk.state.BagState;
+import org.apache.beam.sdk.state.StateSpec;
+import org.apache.beam.sdk.state.StateSpecs;
+import org.apache.beam.sdk.state.TimeDomain;
+import org.apache.beam.sdk.state.Timer;
+import org.apache.beam.sdk.state.TimerSpec;
+import org.apache.beam.sdk.state.TimerSpecs;
+import org.apache.beam.sdk.state.ValueState;
+import org.apache.beam.sdk.transforms.DoFn;
+import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
+import org.apache.beam.sdk.values.KV;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Lists;
+import org.joda.time.Duration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/** A {@link DoFn} to write {@link SplunkEvent}s to Splunk's HEC endpoint. */
+@AutoValue
+abstract class SplunkEventWriter extends DoFn<KV<Integer, SplunkEvent>, SplunkWriteError> {
+
+  private static final Integer DEFAULT_BATCH_COUNT = 1;
+  private static final Boolean DEFAULT_DISABLE_CERTIFICATE_VALIDATION = false;
+  private static final Logger LOG = LoggerFactory.getLogger(SplunkEventWriter.class);
+  private static final long DEFAULT_FLUSH_DELAY = 2;
+  private static final Counter INPUT_COUNTER =
+      Metrics.counter(SplunkEventWriter.class, "inbound-events");
+  private static final Counter SUCCESS_WRITES =
+      Metrics.counter(SplunkEventWriter.class, "outbound-successful-events");
+  private static final Counter FAILED_WRITES =
+      Metrics.counter(SplunkEventWriter.class, "outbound-failed-events");
+  private static final String BUFFER_STATE_NAME = "buffer";
+  private static final String COUNT_STATE_NAME = "count";
+  private static final String TIME_ID_NAME = "expiry";
+
+  @StateId(BUFFER_STATE_NAME)
+  private final StateSpec<BagState<SplunkEvent>> buffer = StateSpecs.bag();
+
+  @StateId(COUNT_STATE_NAME)
+  private final StateSpec<ValueState<Long>> count = StateSpecs.value();
+
+  @TimerId(TIME_ID_NAME)
+  private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.EVENT_TIME);
+
+  private Integer batchCount;
+  private Boolean disableValidation;
+  private HttpEventPublisher publisher;
+
+  private static final Gson GSON =
+      new GsonBuilder().setFieldNamingStrategy(f -> f.getName().toLowerCase()).create();
+
+  /** A builder class for creating a {@link SplunkEventWriter}. */
+  static Builder newBuilder() {
+    return new AutoValue_SplunkEventWriter.Builder();
+  }
+
+  @Nullable
+  abstract ValueProvider<String> url();
+
+  @Nullable
+  abstract ValueProvider<String> token();
+
+  @Nullable
+  abstract ValueProvider<Boolean> disableCertificateValidation();
+
+  @Nullable
+  abstract ValueProvider<Integer> inputBatchCount();
+
+  @Setup
+  public void setup() {
+
+    checkArgument(url().isAccessible(), "url is required for writing events.");
+    checkArgument(token().isAccessible(), "Access token is required for writing events.");
+
+    // Either user supplied or default batchCount.
+    if (batchCount == null) {
+
+      if (inputBatchCount() != null) {
+        batchCount = inputBatchCount().get();
+      }
+
+      batchCount = MoreObjects.firstNonNull(batchCount, DEFAULT_BATCH_COUNT);
+      LOG.info("Batch count set to: {}", batchCount);
+    }
+
+    // Either user supplied or default disableValidation.
+    if (disableValidation == null) {
+
+      if (disableCertificateValidation() != null) {
+        disableValidation = disableCertificateValidation().get();
+      }
+
+      disableValidation =
+          MoreObjects.firstNonNull(disableValidation, DEFAULT_DISABLE_CERTIFICATE_VALIDATION);
+      LOG.info("Disable certificate validation set to: {}", disableValidation);
+    }
+
+    try {
+      HttpEventPublisher.Builder builder =

Review comment:
       TODO(pablo) - review the HttpEventPublisher implementation

##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkEvent.java
##########
@@ -0,0 +1,159 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.auto.value.AutoValue;
+import com.google.gson.annotations.SerializedName;
+import javax.annotation.Nullable;
+
+/**
+ * A {@link SplunkEvent} describes a single payload sent to Splunk's Http Event Collector (HEC)
+ * endpoint.
+ *
+ * <p>Each object represents a single event and related metadata elements such as:
+ *
+ * <ul>
+ *   <li>time
+ *   <li>host
+ *   <li>source
+ *   <li>sourceType
+ *   <li>index
+ * </ul>
+ */
+@AutoValue
+public abstract class SplunkEvent {

Review comment:
       see the *AutoValue* section of https://beam.apache.org/documentation/programming-guide/#creating-schemas - there's also a way to provide a Builder annotation
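
       To make that pointer concrete: the linked section also covers AutoValue classes that are built through a builder. A rough sketch of that shape (field and method names are assumptions; whether schema inference uses the builder in the Beam version at hand is covered by the guide):

   ```
   @DefaultSchema(AutoValueSchema.class)
   @AutoValue
   public abstract class SplunkEvent {

     @Nullable
     public abstract String host();

     @Nullable
     public abstract String source();

     public static Builder newBuilder() {
       return new AutoValue_SplunkEvent.Builder();
     }

     @AutoValue.Builder
     public abstract static class Builder {

       public abstract Builder setHost(String host);

       public abstract Builder setSource(String source);

       public abstract SplunkEvent build();
     }
   }
   ```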

##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkIO.java
##########
@@ -0,0 +1,359 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.auto.value.AutoValue;
+import java.util.concurrent.ThreadLocalRandom;
+import javax.annotation.Nullable;
+import org.apache.beam.sdk.annotations.Experimental;
+import org.apache.beam.sdk.annotations.Experimental.Kind;
+import org.apache.beam.sdk.coders.BigEndianIntegerCoder;
+import org.apache.beam.sdk.coders.KvCoder;
+import org.apache.beam.sdk.options.ValueProvider;
+import org.apache.beam.sdk.transforms.DoFn;
+import org.apache.beam.sdk.transforms.PTransform;
+import org.apache.beam.sdk.transforms.ParDo;
+import org.apache.beam.sdk.values.KV;
+import org.apache.beam.sdk.values.PCollection;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * An unbounded sink for Splunk's Http Event Collector (HEC).
+ *
+ * <p>For more information, see the online documentation at <a
+ * href="https://dev.splunk.com/enterprise/docs/dataapps/httpeventcollector/">Splunk HEC</a>.
+ *
+ * <h3>Writing to Splunk's HEC</h3>
+ *
+ * <p>The {@link SplunkIO} class provides a {@link PTransform} that allows writing {@link
+ * SplunkEvent} messages into a Splunk HEC end point.
+ *
+ * <p>It takes as an input a {@link PCollection PCollection&lt;SplunkEvent&gt;}, where each {@link
+ * SplunkEvent} represents an event to be published to HEC.
+ *
+ * <p>To configure a {@link SplunkIO}, you must provide at a minimum:
+ *
+ * <ul>
+ *   <li>url - HEC endpoint URL.
+ *   <li>token - HEC endpoint token.
+ * </ul>
+ *
+ * <p>The {@link SplunkIO} transform can be customized further by optionally specifying:
+ *
+ * <ul>
+ *   <li>parallelism - Number of parallel requests to the HEC.
+ *   <li>batchCount - Number of events in a single batch.
+ *   <li>disableCertificateValidation - Whether to disable ssl validation (useful for self-signed
+ *       certificates)
+ * </ul>
+ *
+ * <p>This transform will return any non-transient write failures via a {@link PCollection
+ * PCollection&lt;SplunkWriteError&gt;}, where each {@link SplunkWriteError} captures the error that
+ * occurred while attempting to write to HEC. These can be published to a dead-letter sink or
+ * reprocessed.
+ *
+ * <p>For example:
+ *
+ * <pre>{@code
+ * PCollection<SplunkEvent> events = ...;
+ *
+ * PCollection<SplunkWriteError> errors =
+ *         events.apply("WriteToSplunk",
+ *              SplunkIO.writeBuilder()

Review comment:
       note the example of PubsubIO, where the builder is used internally as part of the implementation, but externally, the built transform is passed around: https://github.com/apache/beam/blob/master/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubIO.java#L931-L965
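
       Following that pattern here, the builder would only ever be used inside a static factory on SplunkIO, e.g. (illustrative sketch; assumes Write is an AutoValue class nested in SplunkIO, which is roughly the shape the updated example later in this thread adopts):

   ```
   // Sketch only: the AutoValue builder never leaves SplunkIO; callers receive
   // the built transform and refine it with the fluent with* methods.
   public static Write write(ValueProvider<String> url, ValueProvider<String> token) {
     checkNotNull(url, "url is required.");
     checkNotNull(token, "token is required.");
     return new AutoValue_SplunkIO_Write.Builder().setUrl(url).setToken(token).build();
   }
   ```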

##########
File path: sdks/java/io/splunk/build.gradle
##########
@@ -0,0 +1,39 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * License); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an AS IS BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+plugins {
+    id 'org.apache.beam.module'
+}
+applyJavaNature(automaticModuleName: 'org.apache.beam.sdk.io.splunk')
+
+description = "Apache Beam :: SDKs :: Java :: IO :: Splunk"
+ext.summary = "IO to write events to Splunk Http Event Collector (HEC)"
+
+dependencies {
+    compile library.java.slf4j_api
+    compile project(path: ":sdks:java:core", configuration: "shadow")
+    compile group: "com.google.code.gson", name: "gson", version: "2.8.6"
+    compile group: "com.google.api-client", name: "google-api-client", version: "1.30.9"
+    compile group: "com.google.http-client", name: "google-http-client-apache-v2", version: "1.31.0"

Review comment:
       TODO(pablo/sameer) - perhaps add these new dependencies to the project-wide dependency definitions. Ensure that they are consistent with other com.google client/api dependencies.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
pabloem commented on pull request #11950:
URL: https://github.com/apache/beam/pull/11950#issuecomment-641604419


   thanks. Looking once more...


----------------------------------------------------------------



[GitHub] [beam] sabhyankar commented on pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
sabhyankar commented on pull request #11950:
URL: https://github.com/apache/beam/pull/11950#issuecomment-640865650


   R: @pabloem 


----------------------------------------------------------------



[GitHub] [beam] pabloem merged pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
pabloem merged pull request #11950:
URL: https://github.com/apache/beam/pull/11950


   


----------------------------------------------------------------



[GitHub] [beam] sabhyankar commented on a change in pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
sabhyankar commented on a change in pull request #11950:
URL: https://github.com/apache/beam/pull/11950#discussion_r438829308



##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkEventWriter.java
##########
@@ -0,0 +1,395 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.api.client.http.HttpResponse;
+import com.google.api.client.http.HttpResponseException;
+import com.google.auto.value.AutoValue;
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+import java.io.IOException;
+import java.io.UnsupportedEncodingException;
+import java.security.KeyManagementException;
+import java.security.KeyStoreException;
+import java.security.NoSuchAlgorithmException;
+import java.util.List;
+import javax.annotation.Nullable;
+import org.apache.beam.sdk.metrics.Counter;
+import org.apache.beam.sdk.metrics.Metrics;
+import org.apache.beam.sdk.options.ValueProvider;
+import org.apache.beam.sdk.state.BagState;
+import org.apache.beam.sdk.state.StateSpec;
+import org.apache.beam.sdk.state.StateSpecs;
+import org.apache.beam.sdk.state.TimeDomain;
+import org.apache.beam.sdk.state.Timer;
+import org.apache.beam.sdk.state.TimerSpec;
+import org.apache.beam.sdk.state.TimerSpecs;
+import org.apache.beam.sdk.state.ValueState;
+import org.apache.beam.sdk.transforms.DoFn;
+import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
+import org.apache.beam.sdk.values.KV;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Lists;
+import org.joda.time.Duration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/** A {@link DoFn} to write {@link SplunkEvent}s to Splunk's HEC endpoint. */
+@AutoValue
+abstract class SplunkEventWriter extends DoFn<KV<Integer, SplunkEvent>, SplunkWriteError> {
+
+  private static final Integer DEFAULT_BATCH_COUNT = 1;
+  private static final Boolean DEFAULT_DISABLE_CERTIFICATE_VALIDATION = false;
+  private static final Logger LOG = LoggerFactory.getLogger(SplunkEventWriter.class);
+  private static final long DEFAULT_FLUSH_DELAY = 2;
+  private static final Counter INPUT_COUNTER =
+      Metrics.counter(SplunkEventWriter.class, "inbound-events");
+  private static final Counter SUCCESS_WRITES =
+      Metrics.counter(SplunkEventWriter.class, "outbound-successful-events");
+  private static final Counter FAILED_WRITES =
+      Metrics.counter(SplunkEventWriter.class, "outbound-failed-events");
+  private static final String BUFFER_STATE_NAME = "buffer";
+  private static final String COUNT_STATE_NAME = "count";
+  private static final String TIME_ID_NAME = "expiry";
+
+  @StateId(BUFFER_STATE_NAME)
+  private final StateSpec<BagState<SplunkEvent>> buffer = StateSpecs.bag();
+
+  @StateId(COUNT_STATE_NAME)
+  private final StateSpec<ValueState<Long>> count = StateSpecs.value();
+
+  @TimerId(TIME_ID_NAME)
+  private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.EVENT_TIME);
+
+  private Integer batchCount;
+  private Boolean disableValidation;
+  private HttpEventPublisher publisher;
+
+  private static final Gson GSON =
+      new GsonBuilder().setFieldNamingStrategy(f -> f.getName().toLowerCase()).create();
+
+  /** A builder class for creating a {@link SplunkEventWriter}. */
+  static Builder newBuilder() {
+    return new AutoValue_SplunkEventWriter.Builder();
+  }
+
+  @Nullable
+  abstract ValueProvider<String> url();
+
+  @Nullable
+  abstract ValueProvider<String> token();
+
+  @Nullable
+  abstract ValueProvider<Boolean> disableCertificateValidation();
+
+  @Nullable
+  abstract ValueProvider<Integer> inputBatchCount();
+
+  @Setup
+  public void setup() {
+
+    checkArgument(url().isAccessible(), "url is required for writing events.");
+    checkArgument(token().isAccessible(), "Access token is required for writing events.");
+
+    // Either user supplied or default batchCount.
+    if (batchCount == null) {
+
+      if (inputBatchCount() != null) {
+        batchCount = inputBatchCount().get();
+      }
+
+      batchCount = MoreObjects.firstNonNull(batchCount, DEFAULT_BATCH_COUNT);
+      LOG.info("Batch count set to: {}", batchCount);
+    }
+
+    // Either user supplied or default disableValidation.
+    if (disableValidation == null) {
+
+      if (disableCertificateValidation() != null) {
+        disableValidation = disableCertificateValidation().get();
+      }
+
+      disableValidation =
+          MoreObjects.firstNonNull(disableValidation, DEFAULT_DISABLE_CERTIFICATE_VALIDATION);
+      LOG.info("Disable certificate validation set to: {}", disableValidation);
+    }
+
+    try {
+      HttpEventPublisher.Builder builder =
+          HttpEventPublisher.newBuilder()
+              .withUrl(url().get())
+              .withToken(token().get())
+              .withDisableCertificateValidation(disableValidation);
+
+      publisher = builder.build();
+      LOG.info("Successfully created HttpEventPublisher");
+
+    } catch (NoSuchAlgorithmException
+        | KeyStoreException
+        | KeyManagementException
+        | UnsupportedEncodingException e) {
+      LOG.error("Error creating HttpEventPublisher: {}", e.getMessage());
+      throw new RuntimeException(e);
+    }
+  }
+
+  @ProcessElement
+  public void processElement(
+      @Element KV<Integer, SplunkEvent> input,
+      OutputReceiver<SplunkWriteError> receiver,
+      BoundedWindow window,
+      @StateId(BUFFER_STATE_NAME) BagState<SplunkEvent> bufferState,
+      @StateId(COUNT_STATE_NAME) ValueState<Long> countState,
+      @TimerId(TIME_ID_NAME) Timer timer)
+      throws IOException {
+
+    Long count = MoreObjects.<Long>firstNonNull(countState.read(), 0L);
+    SplunkEvent event = input.getValue();
+    INPUT_COUNTER.inc();
+    bufferState.add(event);
+    count += 1;
+    countState.write(count);
+    timer.offset(Duration.standardSeconds(DEFAULT_FLUSH_DELAY)).setRelative();
+
+    if (count >= batchCount) {
+
+      LOG.info("Flushing batch of {} events", count);
+      flush(receiver, bufferState, countState);
+    }
+  }
+
+  @OnTimer(TIME_ID_NAME)
+  public void onExpiry(
+      OutputReceiver<SplunkWriteError> receiver,
+      @StateId(BUFFER_STATE_NAME) BagState<SplunkEvent> bufferState,
+      @StateId(COUNT_STATE_NAME) ValueState<Long> countState)
+      throws IOException {
+
+    if (MoreObjects.<Long>firstNonNull(countState.read(), 0L) > 0) {
+      LOG.info("Flushing window with {} events", countState.read());
+      flush(receiver, bufferState, countState);
+    }
+  }
+
+  @Teardown
+  public void tearDown() {
+    if (this.publisher != null) {
+      try {
+        this.publisher.close();
+        LOG.info("Successfully closed HttpEventPublisher");
+
+      } catch (IOException e) {
+        LOG.warn("Received exception while closing HttpEventPublisher: {}", e.getMessage());
+      }
+    }
+  }
+
+  /**
+   * Flushes a batch of requests via {@link HttpEventPublisher}.
+   *
+   * @param receiver Receiver to write {@link SplunkWriteError}s to
+   */
+  private void flush(
+      OutputReceiver<SplunkWriteError> receiver,
+      @StateId(BUFFER_STATE_NAME) BagState<SplunkEvent> bufferState,
+      @StateId(COUNT_STATE_NAME) ValueState<Long> countState)

Review comment:
       Done




----------------------------------------------------------------



[GitHub] [beam] sabhyankar commented on pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
sabhyankar commented on pull request #11950:
URL: https://github.com/apache/beam/pull/11950#issuecomment-642696537


   Thanks @pabloem - Pushed a commit with the remaining changes that were requested.


----------------------------------------------------------------



[GitHub] [beam] pabloem commented on pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
pabloem commented on pull request #11950:
URL: https://github.com/apache/beam/pull/11950#issuecomment-642862089


   retest this please


----------------------------------------------------------------



[GitHub] [beam] sabhyankar commented on a change in pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
sabhyankar commented on a change in pull request #11950:
URL: https://github.com/apache/beam/pull/11950#discussion_r438829089



##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkEventWriter.java
##########
@@ -0,0 +1,395 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.sdk.io.splunk;
+
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument;
+import static org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkNotNull;
+
+import com.google.api.client.http.HttpResponse;
+import com.google.api.client.http.HttpResponseException;
+import com.google.auto.value.AutoValue;
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+import java.io.IOException;
+import java.io.UnsupportedEncodingException;
+import java.security.KeyManagementException;
+import java.security.KeyStoreException;
+import java.security.NoSuchAlgorithmException;
+import java.util.List;
+import javax.annotation.Nullable;
+import org.apache.beam.sdk.metrics.Counter;
+import org.apache.beam.sdk.metrics.Metrics;
+import org.apache.beam.sdk.options.ValueProvider;
+import org.apache.beam.sdk.state.BagState;
+import org.apache.beam.sdk.state.StateSpec;
+import org.apache.beam.sdk.state.StateSpecs;
+import org.apache.beam.sdk.state.TimeDomain;
+import org.apache.beam.sdk.state.Timer;
+import org.apache.beam.sdk.state.TimerSpec;
+import org.apache.beam.sdk.state.TimerSpecs;
+import org.apache.beam.sdk.state.ValueState;
+import org.apache.beam.sdk.transforms.DoFn;
+import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
+import org.apache.beam.sdk.values.KV;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.MoreObjects;
+import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.Lists;
+import org.joda.time.Duration;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/** A {@link DoFn} to write {@link SplunkEvent}s to Splunk's HEC endpoint. */
+@AutoValue
+abstract class SplunkEventWriter extends DoFn<KV<Integer, SplunkEvent>, SplunkWriteError> {
+
+  private static final Integer DEFAULT_BATCH_COUNT = 1;
+  private static final Boolean DEFAULT_DISABLE_CERTIFICATE_VALIDATION = false;
+  private static final Logger LOG = LoggerFactory.getLogger(SplunkEventWriter.class);
+  private static final long DEFAULT_FLUSH_DELAY = 2;
+  private static final Counter INPUT_COUNTER =
+      Metrics.counter(SplunkEventWriter.class, "inbound-events");
+  private static final Counter SUCCESS_WRITES =
+      Metrics.counter(SplunkEventWriter.class, "outbound-successful-events");
+  private static final Counter FAILED_WRITES =
+      Metrics.counter(SplunkEventWriter.class, "outbound-failed-events");
+  private static final String BUFFER_STATE_NAME = "buffer";
+  private static final String COUNT_STATE_NAME = "count";
+  private static final String TIME_ID_NAME = "expiry";
+
+  @StateId(BUFFER_STATE_NAME)
+  private final StateSpec<BagState<SplunkEvent>> buffer = StateSpecs.bag();
+
+  @StateId(COUNT_STATE_NAME)
+  private final StateSpec<ValueState<Long>> count = StateSpecs.value();
+
+  @TimerId(TIME_ID_NAME)
+  private final TimerSpec expirySpec = TimerSpecs.timer(TimeDomain.EVENT_TIME);

Review comment:
       Good catch! Done!




----------------------------------------------------------------



[GitHub] [beam] pabloem commented on pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
pabloem commented on pull request #11950:
URL: https://github.com/apache/beam/pull/11950#issuecomment-640877094


   retest this please


----------------------------------------------------------------



[GitHub] [beam] sabhyankar commented on a change in pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
sabhyankar commented on a change in pull request #11950:
URL: https://github.com/apache/beam/pull/11950#discussion_r438774319



##########
File path: sdks/java/io/splunk/src/main/java/org/apache/beam/sdk/io/splunk/SplunkIO.java
##########
@@ -77,32 +78,48 @@
  *
  * PCollection<SplunkWriteError> errors =
  *         events.apply("WriteToSplunk",
- *              SplunkIO.writeBuilder()
- *                     .withToken(token)
- *                     .withUrl(url)
- *                     .withBatchCount(batchCount)
- *                     .withParallelism(parallelism)
- *                     .withDisableCertificateValidation(true)
- *                     .build());
+ *              SplunkIO.write(url, token)
+ *                  .withBatchCount(batchCount)
+ *                  .withParallelism(parallelism)
+ *                  .withDisableCertificateValidation(true));
  * }</pre>
  */
 @Experimental(Kind.SOURCE_SINK)
 public class SplunkIO {
 
+  /**
+   * Write to Splunk's Http Event Collector (HEC).
+   *
+   * @param url splunk hec url
+   * @param token splunk hec authentication token

Review comment:
       A token is required in order to use HEC, and each endpoint has at least one token, so this should be a required parameter. (See: [Using the HTTP Event Collector](https://docs.splunk.com/Documentation/Splunk/8.0.4/Data/UsetheHTTPEventCollector#Create_an_Event_Collector_token))




----------------------------------------------------------------



[GitHub] [beam] sabhyankar commented on pull request #11950: [BEAM-8596]: Add SplunkIO transform to write messages to Splunk

Posted by GitBox <gi...@apache.org>.
sabhyankar commented on pull request #11950:
URL: https://github.com/apache/beam/pull/11950#issuecomment-641500443


   @pabloem Thanks for the quick reviews! I have pushed a couple of commits with the mods you requested.


----------------------------------------------------------------