Posted to commits@camel.apache.org by da...@apache.org on 2019/12/08 11:14:02 UTC

[camel] branch master updated (6d1e975 -> 33805aa)

This is an automated email from the ASF dual-hosted git repository.

davsclaus pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/camel.git.


    from 6d1e975  camel-jacksonxml: Make pretty-print test platform independent.
     new 807aa89  CAMEL-14263: Fixed compile issue with ganglia and rebuild
     new 33805aa  Regen

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../src/main/docs/ganglia-component.adoc           |   2 +-
 .../component/ganglia/GangliaConfiguration.java    |   2 +-
 .../src/main/docs/nagios-component.adoc            |   3 +-
 .../sjms/batch/SjmsBatchEndpointTest.java          |   2 +-
 .../endpoint/dsl/SparkEndpointBuilderFactory.java  | 347 ++++++---------------
 .../modules/ROOT/pages/ganglia-component.adoc      |   2 +-
 .../modules/ROOT/pages/nagios-component.adoc       |   3 +-
 .../springboot/GangliaComponentConfiguration.java  |   2 +-
 8 files changed, 111 insertions(+), 252 deletions(-)
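The compile issue behind CAMEL-14263 comes down to enum-constant casing: the Ganglia `GMetricType` constants are upper-case, so lowercase names such as `string` cannot be resolved via `Enum.valueOf`. The stand-in enum below is a hedged illustration of that behavior, not Camel's actual class:

```java
// Hedged stand-in for the CAMEL-14263 casing fix: the Ganglia
// GMetricType constants are upper-case, so a lowercase default such as
// "string" no longer resolves through Enum.valueOf. This enum is a
// local illustration, not Camel's actual class.
public class GMetricTypeCasing {
    enum GMetricType { STRING, INT8, UINT8, INT16, UINT16, INT32, UINT32, FLOAT, DOUBLE }

    // Enum.valueOf is case-sensitive; only exact constant names resolve.
    static boolean resolves(String name) {
        try {
            GMetricType.valueOf(name);
            return true;
        } catch (IllegalArgumentException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(resolves("STRING")); // true: matches the new default
        System.out.println(resolves("string")); // false: the old lowercase default fails
    }
}
```

This is why the commit changes both the `@UriParam` default and its `enums` list to the upper-case spellings seen in the diff below.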


[camel] 02/02: Regen

Posted by da...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

davsclaus pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/camel.git

commit 33805aa6f5e4183ff8a366215d799d79998bde21
Author: Claus Ibsen <cl...@gmail.com>
AuthorDate: Sun Dec 8 12:13:38 2019 +0100

    Regen
---
 docs/components/modules/ROOT/pages/ganglia-component.adoc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/components/modules/ROOT/pages/ganglia-component.adoc b/docs/components/modules/ROOT/pages/ganglia-component.adoc
index 9575bbe..2066803 100644
--- a/docs/components/modules/ROOT/pages/ganglia-component.adoc
+++ b/docs/components/modules/ROOT/pages/ganglia-component.adoc
@@ -111,7 +111,7 @@ with the following path and query parameters:
 | *spoofHostname* (producer) | Spoofing information IP:hostname |  | String
 | *tmax* (producer) | Maximum time in seconds that the value can be considered current. After this, Ganglia considers the value to have expired. | 60 | int
 | *ttl* (producer) | If using multicast, set the TTL of the packets | 5 | int
-| *type* (producer) | The type of value | string | GMetricType
+| *type* (producer) | The type of value | STRING | GMetricType
 | *units* (producer) | Any unit of measurement that qualifies the metric, e.g. widgets, litres, bytes. Do not include a prefix such as k (kilo) or m (milli), other tools may scale the units later. The value should be unscaled. |  | String
 | *wireFormat31x* (producer) | Use the wire format of Ganglia 3.1.0 and later versions. Set this to false to use Ganglia 3.0.x or earlier. | true | boolean
 | *basicPropertyBinding* (advanced) | Whether the endpoint should use basic property binding (Camel 2.x) or the newer property binding with additional capabilities | false | boolean
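The "Regen" commit only refreshes the generated doc table: the default column now reads `STRING` because documentation generation renders the name of the configured default enum constant, which was upper-cased in the previous commit. A hedged sketch of that rendering step, using a stand-in enum rather than the camel-package-maven-plugin itself:

```java
// Sketch of why the regenerated table's default column reads "STRING":
// doc generation renders the default enum constant's name, and the
// constants were upper-cased in CAMEL-14263. Stand-in enum and
// renderer; not the actual camel-package-maven-plugin.
public class DefaultColumnDemo {
    enum GMetricType { STRING, INT8, UINT8, INT16, UINT16, INT32, UINT32, FLOAT, DOUBLE }

    // Mimics how a doc generator would print an enum default value.
    static String renderDefault(Enum<?> def) {
        return def.name();
    }

    public static void main(String[] args) {
        System.out.println(renderDefault(GMetricType.STRING)); // prints "STRING"
    }
}
```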


[camel] 01/02: CAMEL-14263: Fixed compile issue with ganglia and rebuild

Posted by da...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

davsclaus pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/camel.git

commit 807aa8962af16d0a72ab74bb122de480f93d9212
Author: Claus Ibsen <cl...@gmail.com>
AuthorDate: Sun Dec 8 10:19:55 2019 +0100

    CAMEL-14263: Fixed compile issue with ganglia and rebuild
---
 .../src/main/docs/ganglia-component.adoc           |   2 +-
 .../component/ganglia/GangliaConfiguration.java    |   2 +-
 .../src/main/docs/nagios-component.adoc            |   3 +-
 .../sjms/batch/SjmsBatchEndpointTest.java          |   2 +-
 .../endpoint/dsl/SparkEndpointBuilderFactory.java  | 347 ++++++---------------
 .../modules/ROOT/pages/nagios-component.adoc       |   3 +-
 .../springboot/GangliaComponentConfiguration.java  |   2 +-
 7 files changed, 110 insertions(+), 251 deletions(-)

diff --git a/components/camel-ganglia/src/main/docs/ganglia-component.adoc b/components/camel-ganglia/src/main/docs/ganglia-component.adoc
index 9fb5270..f4c19c8 100644
--- a/components/camel-ganglia/src/main/docs/ganglia-component.adoc
+++ b/components/camel-ganglia/src/main/docs/ganglia-component.adoc
@@ -110,7 +110,7 @@ with the following path and query parameters:
 | *spoofHostname* (producer) | Spoofing information IP:hostname |  | String
 | *tmax* (producer) | Maximum time in seconds that the value can be considered current. After this, Ganglia considers the value to have expired. | 60 | int
 | *ttl* (producer) | If using multicast, set the TTL of the packets | 5 | int
-| *type* (producer) | The type of value | string | GMetricType
+| *type* (producer) | The type of value | STRING | GMetricType
 | *units* (producer) | Any unit of measurement that qualifies the metric, e.g. widgets, litres, bytes. Do not include a prefix such as k (kilo) or m (milli), other tools may scale the units later. The value should be unscaled. |  | String
 | *wireFormat31x* (producer) | Use the wire format of Ganglia 3.1.0 and later versions. Set this to false to use Ganglia 3.0.x or earlier. | true | boolean
 | *basicPropertyBinding* (advanced) | Whether the endpoint should use basic property binding (Camel 2.x) or the newer property binding with additional capabilities | false | boolean
diff --git a/components/camel-ganglia/src/main/java/org/apache/camel/component/ganglia/GangliaConfiguration.java b/components/camel-ganglia/src/main/java/org/apache/camel/component/ganglia/GangliaConfiguration.java
index acdbf3f..7a99eb3 100644
--- a/components/camel-ganglia/src/main/java/org/apache/camel/component/ganglia/GangliaConfiguration.java
+++ b/components/camel-ganglia/src/main/java/org/apache/camel/component/ganglia/GangliaConfiguration.java
@@ -70,7 +70,7 @@ public class GangliaConfiguration implements Cloneable {
     @UriParam(defaultValue = "metric")
     private String metricName = DEFAULT_METRIC_NAME;
 
-    @UriParam(defaultValue = "string", enums = "string,int8,uint8,int16,uint16,int32,uint32,float,double")
+    @UriParam(defaultValue = "STRING", enums = "STRING,INT8,UINT8,INT16,UINT16,INT32,UINT32,FLOAT,DOUBLE")
     private GMetricType type = DEFAULT_TYPE;
 
     @UriParam(defaultValue = "BOTH", enums = "ZERO,POSITIVE,NEGATIVE,BOTH")
diff --git a/components/camel-nagios/src/main/docs/nagios-component.adoc b/components/camel-nagios/src/main/docs/nagios-component.adoc
index 89bc709..2825dfe 100644
--- a/components/camel-nagios/src/main/docs/nagios-component.adoc
+++ b/components/camel-nagios/src/main/docs/nagios-component.adoc
@@ -115,7 +115,7 @@ When using Spring Boot make sure to use the following Maven dependency to have s
 ----
 
 
-The component supports 12 options, which are listed below.
+The component supports 11 options, which are listed below.
 
 
 
@@ -133,7 +133,6 @@ The component supports 12 options, which are listed below.
 | *camel.component.nagios.configuration.timeout* | Sending timeout in millis. | 5000 | Integer
 | *camel.component.nagios.enabled* | Whether to enable auto configuration of the nagios component. This is enabled by default. |  | Boolean
 | *camel.component.nagios.lazy-start-producer* | Whether the producer should be started lazy (on the first message). By starting lazy you can use this to allow CamelContext and routes to startup in situations where a producer may otherwise fail during starting and cause the route to fail being started. By deferring this startup to be lazy then the startup failure can be handled during routing messages via Camel's routing error handlers. Beware that when the first message is processed the [...]
-| *camel.component.nagios.configuration.encryption-method* | *Deprecated* To specify an encryption method. |  | NagiosEncryptionMethod
 |===
 // spring-boot-auto-configure options: END
 
diff --git a/components/camel-sjms/src/test/java/org/apache/camel/component/sjms/batch/SjmsBatchEndpointTest.java b/components/camel-sjms/src/test/java/org/apache/camel/component/sjms/batch/SjmsBatchEndpointTest.java
index 152f14d..a1fba4e 100644
--- a/components/camel-sjms/src/test/java/org/apache/camel/component/sjms/batch/SjmsBatchEndpointTest.java
+++ b/components/camel-sjms/src/test/java/org/apache/camel/component/sjms/batch/SjmsBatchEndpointTest.java
@@ -78,7 +78,7 @@ public class SjmsBatchEndpointTest extends CamelTestSupport {
         return true;
     }
 
-    @Test(expected = FailedToStartRouteException.class)
+    @Test(expected = FailedToCreateRouteException.class)
     public void testProducerFailure() throws Exception {
         context.addRoutes(new RouteBuilder() {
             public void configure() throws Exception {
diff --git a/core/camel-endpointdsl/src/main/java/org/apache/camel/builder/endpoint/dsl/SparkEndpointBuilderFactory.java b/core/camel-endpointdsl/src/main/java/org/apache/camel/builder/endpoint/dsl/SparkEndpointBuilderFactory.java
index c3eea0f..1c0237d 100644
--- a/core/camel-endpointdsl/src/main/java/org/apache/camel/builder/endpoint/dsl/SparkEndpointBuilderFactory.java
+++ b/core/camel-endpointdsl/src/main/java/org/apache/camel/builder/endpoint/dsl/SparkEndpointBuilderFactory.java
@@ -17,15 +17,13 @@
 package org.apache.camel.builder.endpoint.dsl;
 
 import javax.annotation.Generated;
-import org.apache.camel.ExchangePattern;
 import org.apache.camel.builder.EndpointConsumerBuilder;
 import org.apache.camel.builder.EndpointProducerBuilder;
 import org.apache.camel.builder.endpoint.AbstractEndpointBuilder;
-import org.apache.camel.spi.ExceptionHandler;
 
 /**
- * The spark-rest component is used for hosting REST services which has been
- * defined using Camel rest-dsl.
+ * The spark component can be used to send RDD or DataFrame jobs to Apache Spark
+ * cluster.
  * 
  * Generated by camel-package-maven-plugin - do not edit this file!
  */
@@ -34,262 +32,179 @@ public interface SparkEndpointBuilderFactory {
 
 
     /**
-     * Builder for endpoint for the Spark Rest component.
+     * Builder for endpoint for the Spark component.
      */
-    public interface SparkEndpointBuilder extends EndpointConsumerBuilder {
+    public interface SparkEndpointBuilder extends EndpointProducerBuilder {
         default AdvancedSparkEndpointBuilder advanced() {
             return (AdvancedSparkEndpointBuilder) this;
         }
         /**
-         * Accept type such as: 'text/xml', or 'application/json'. By default we
-         * accept all kinds of types.
-         * 
-         * The option is a: <code>java.lang.String</code> type.
-         * 
-         * Group: consumer
-         */
-        default SparkEndpointBuilder accept(String accept) {
-            doSetProperty("accept", accept);
-            return this;
-        }
-        /**
-         * Allows for bridging the consumer to the Camel routing Error Handler,
-         * which mean any exceptions occurred while the consumer is trying to
-         * pickup incoming messages, or the likes, will now be processed as a
-         * message and handled by the routing Error Handler. By default the
-         * consumer will use the org.apache.camel.spi.ExceptionHandler to deal
-         * with exceptions, that will be logged at WARN or ERROR level and
-         * ignored.
-         * 
-         * The option is a: <code>boolean</code> type.
-         * 
-         * Group: consumer
-         */
-        default SparkEndpointBuilder bridgeErrorHandler(
-                boolean bridgeErrorHandler) {
-            doSetProperty("bridgeErrorHandler", bridgeErrorHandler);
-            return this;
-        }
-        /**
-         * Allows for bridging the consumer to the Camel routing Error Handler,
-         * which mean any exceptions occurred while the consumer is trying to
-         * pickup incoming messages, or the likes, will now be processed as a
-         * message and handled by the routing Error Handler. By default the
-         * consumer will use the org.apache.camel.spi.ExceptionHandler to deal
-         * with exceptions, that will be logged at WARN or ERROR level and
-         * ignored.
-         * 
-         * The option will be converted to a <code>boolean</code> type.
-         * 
-         * Group: consumer
-         */
-        default SparkEndpointBuilder bridgeErrorHandler(
-                String bridgeErrorHandler) {
-            doSetProperty("bridgeErrorHandler", bridgeErrorHandler);
-            return this;
-        }
-        /**
-         * Determines whether or not the raw input stream from Spark
-         * HttpRequest#getContent() is cached or not (Camel will read the stream
-         * into a in light-weight memory based Stream caching) cache. By default
-         * Camel will cache the Netty input stream to support reading it
-         * multiple times to ensure Camel can retrieve all data from the stream.
-         * However you can set this option to true when you for example need to
-         * access the raw stream, such as streaming it directly to a file or
-         * other persistent store. Mind that if you enable this option, then you
-         * cannot read the Netty stream multiple times out of the box, and you
-         * would need manually to reset the reader index on the Spark raw
-         * stream.
+         * Indicates if results should be collected or counted.
          * 
          * The option is a: <code>boolean</code> type.
          * 
-         * Group: consumer
+         * Group: producer
          */
-        default SparkEndpointBuilder disableStreamCache(
-                boolean disableStreamCache) {
-            doSetProperty("disableStreamCache", disableStreamCache);
+        default SparkEndpointBuilder collect(boolean collect) {
+            doSetProperty("collect", collect);
             return this;
         }
         /**
-         * Determines whether or not the raw input stream from Spark
-         * HttpRequest#getContent() is cached or not (Camel will read the stream
-         * into a in light-weight memory based Stream caching) cache. By default
-         * Camel will cache the Netty input stream to support reading it
-         * multiple times to ensure Camel can retrieve all data from the stream.
-         * However you can set this option to true when you for example need to
-         * access the raw stream, such as streaming it directly to a file or
-         * other persistent store. Mind that if you enable this option, then you
-         * cannot read the Netty stream multiple times out of the box, and you
-         * would need manually to reset the reader index on the Spark raw
-         * stream.
+         * Indicates if results should be collected or counted.
          * 
          * The option will be converted to a <code>boolean</code> type.
          * 
-         * Group: consumer
+         * Group: producer
          */
-        default SparkEndpointBuilder disableStreamCache(
-                String disableStreamCache) {
-            doSetProperty("disableStreamCache", disableStreamCache);
+        default SparkEndpointBuilder collect(String collect) {
+            doSetProperty("collect", collect);
             return this;
         }
         /**
-         * If this option is enabled, then during binding from Spark to Camel
-         * Message then the headers will be mapped as well (eg added as header
-         * to the Camel Message as well). You can turn off this option to
-         * disable this. The headers can still be accessed from the
-         * org.apache.camel.component.sparkrest.SparkMessage message with the
-         * method getRequest() that returns the Spark HTTP request instance.
+         * DataFrame to compute against.
          * 
-         * The option is a: <code>boolean</code> type.
+         * The option is a:
+         * <code>org.apache.spark.sql.Dataset&lt;org.apache.spark.sql.Row&gt;</code> type.
          * 
-         * Group: consumer
+         * Group: producer
          */
-        default SparkEndpointBuilder mapHeaders(boolean mapHeaders) {
-            doSetProperty("mapHeaders", mapHeaders);
+        default SparkEndpointBuilder dataFrame(Object dataFrame) {
+            doSetProperty("dataFrame", dataFrame);
             return this;
         }
         /**
-         * If this option is enabled, then during binding from Spark to Camel
-         * Message then the headers will be mapped as well (eg added as header
-         * to the Camel Message as well). You can turn off this option to
-         * disable this. The headers can still be accessed from the
-         * org.apache.camel.component.sparkrest.SparkMessage message with the
-         * method getRequest() that returns the Spark HTTP request instance.
+         * DataFrame to compute against.
          * 
-         * The option will be converted to a <code>boolean</code> type.
+         * The option will be converted to a
+         * <code>org.apache.spark.sql.Dataset&lt;org.apache.spark.sql.Row&gt;</code> type.
          * 
-         * Group: consumer
+         * Group: producer
          */
-        default SparkEndpointBuilder mapHeaders(String mapHeaders) {
-            doSetProperty("mapHeaders", mapHeaders);
+        default SparkEndpointBuilder dataFrame(String dataFrame) {
+            doSetProperty("dataFrame", dataFrame);
             return this;
         }
         /**
-         * If enabled and an Exchange failed processing on the consumer side,
-         * and if the caused Exception was send back serialized in the response
-         * as a application/x-java-serialized-object content type. This is by
-         * default turned off. If you enable this then be aware that Java will
-         * deserialize the incoming data from the request to Java and that can
-         * be a potential security risk.
+         * Function performing action against an DataFrame.
          * 
-         * The option is a: <code>boolean</code> type.
+         * The option is a:
+         * <code>org.apache.camel.component.spark.DataFrameCallback</code> type.
          * 
-         * Group: consumer
+         * Group: producer
          */
-        default SparkEndpointBuilder transferException(boolean transferException) {
-            doSetProperty("transferException", transferException);
+        default SparkEndpointBuilder dataFrameCallback(Object dataFrameCallback) {
+            doSetProperty("dataFrameCallback", dataFrameCallback);
             return this;
         }
         /**
-         * If enabled and an Exchange failed processing on the consumer side,
-         * and if the caused Exception was send back serialized in the response
-         * as a application/x-java-serialized-object content type. This is by
-         * default turned off. If you enable this then be aware that Java will
-         * deserialize the incoming data from the request to Java and that can
-         * be a potential security risk.
+         * Function performing action against an DataFrame.
          * 
-         * The option will be converted to a <code>boolean</code> type.
+         * The option will be converted to a
+         * <code>org.apache.camel.component.spark.DataFrameCallback</code> type.
          * 
-         * Group: consumer
+         * Group: producer
          */
-        default SparkEndpointBuilder transferException(String transferException) {
-            doSetProperty("transferException", transferException);
+        default SparkEndpointBuilder dataFrameCallback(String dataFrameCallback) {
+            doSetProperty("dataFrameCallback", dataFrameCallback);
             return this;
         }
         /**
-         * If this option is enabled, then during binding from Spark to Camel
-         * Message then the header values will be URL decoded (eg %20 will be a
-         * space character.).
+         * Whether the producer should be started lazy (on the first message).
+         * By starting lazy you can use this to allow CamelContext and routes to
+         * startup in situations where a producer may otherwise fail during
+         * starting and cause the route to fail being started. By deferring this
+         * startup to be lazy then the startup failure can be handled during
+         * routing messages via Camel's routing error handlers. Beware that when
+         * the first message is processed then creating and starting the
+         * producer may take a little time and prolong the total processing time
+         * of the processing.
          * 
          * The option is a: <code>boolean</code> type.
          * 
-         * Group: consumer
+         * Group: producer
          */
-        default SparkEndpointBuilder urlDecodeHeaders(boolean urlDecodeHeaders) {
-            doSetProperty("urlDecodeHeaders", urlDecodeHeaders);
+        default SparkEndpointBuilder lazyStartProducer(boolean lazyStartProducer) {
+            doSetProperty("lazyStartProducer", lazyStartProducer);
             return this;
         }
         /**
-         * If this option is enabled, then during binding from Spark to Camel
-         * Message then the header values will be URL decoded (eg %20 will be a
-         * space character.).
+         * Whether the producer should be started lazy (on the first message).
+         * By starting lazy you can use this to allow CamelContext and routes to
+         * startup in situations where a producer may otherwise fail during
+         * starting and cause the route to fail being started. By deferring this
+         * startup to be lazy then the startup failure can be handled during
+         * routing messages via Camel's routing error handlers. Beware that when
+         * the first message is processed then creating and starting the
+         * producer may take a little time and prolong the total processing time
+         * of the processing.
          * 
          * The option will be converted to a <code>boolean</code> type.
          * 
-         * Group: consumer
+         * Group: producer
          */
-        default SparkEndpointBuilder urlDecodeHeaders(String urlDecodeHeaders) {
-            doSetProperty("urlDecodeHeaders", urlDecodeHeaders);
+        default SparkEndpointBuilder lazyStartProducer(String lazyStartProducer) {
+            doSetProperty("lazyStartProducer", lazyStartProducer);
             return this;
         }
-    }
-
-    /**
-     * Advanced builder for endpoint for the Spark Rest component.
-     */
-    public interface AdvancedSparkEndpointBuilder
-            extends
-                EndpointConsumerBuilder {
-        default SparkEndpointBuilder basic() {
-            return (SparkEndpointBuilder) this;
-        }
         /**
-         * To let the consumer use a custom ExceptionHandler. Notice if the
-         * option bridgeErrorHandler is enabled then this option is not in use.
-         * By default the consumer will deal with exceptions, that will be
-         * logged at WARN or ERROR level and ignored.
+         * RDD to compute against.
          * 
-         * The option is a: <code>org.apache.camel.spi.ExceptionHandler</code>
+         * The option is a: <code>org.apache.spark.api.java.JavaRDDLike</code>
          * type.
          * 
-         * Group: consumer (advanced)
+         * Group: producer
          */
-        default AdvancedSparkEndpointBuilder exceptionHandler(
-                ExceptionHandler exceptionHandler) {
-            doSetProperty("exceptionHandler", exceptionHandler);
+        default SparkEndpointBuilder rdd(Object rdd) {
+            doSetProperty("rdd", rdd);
             return this;
         }
         /**
-         * To let the consumer use a custom ExceptionHandler. Notice if the
-         * option bridgeErrorHandler is enabled then this option is not in use.
-         * By default the consumer will deal with exceptions, that will be
-         * logged at WARN or ERROR level and ignored.
+         * RDD to compute against.
          * 
          * The option will be converted to a
-         * <code>org.apache.camel.spi.ExceptionHandler</code> type.
+         * <code>org.apache.spark.api.java.JavaRDDLike</code> type.
          * 
-         * Group: consumer (advanced)
+         * Group: producer
          */
-        default AdvancedSparkEndpointBuilder exceptionHandler(
-                String exceptionHandler) {
-            doSetProperty("exceptionHandler", exceptionHandler);
+        default SparkEndpointBuilder rdd(String rdd) {
+            doSetProperty("rdd", rdd);
             return this;
         }
         /**
-         * Sets the exchange pattern when the consumer creates an exchange.
+         * Function performing action against an RDD.
          * 
-         * The option is a: <code>org.apache.camel.ExchangePattern</code> type.
+         * The option is a:
+         * <code>org.apache.camel.component.spark.RddCallback</code> type.
          * 
-         * Group: consumer (advanced)
+         * Group: producer
          */
-        default AdvancedSparkEndpointBuilder exchangePattern(
-                ExchangePattern exchangePattern) {
-            doSetProperty("exchangePattern", exchangePattern);
+        default SparkEndpointBuilder rddCallback(Object rddCallback) {
+            doSetProperty("rddCallback", rddCallback);
             return this;
         }
         /**
-         * Sets the exchange pattern when the consumer creates an exchange.
+         * Function performing action against an RDD.
          * 
          * The option will be converted to a
-         * <code>org.apache.camel.ExchangePattern</code> type.
+         * <code>org.apache.camel.component.spark.RddCallback</code> type.
          * 
-         * Group: consumer (advanced)
+         * Group: producer
          */
-        default AdvancedSparkEndpointBuilder exchangePattern(
-                String exchangePattern) {
-            doSetProperty("exchangePattern", exchangePattern);
+        default SparkEndpointBuilder rddCallback(String rddCallback) {
+            doSetProperty("rddCallback", rddCallback);
             return this;
         }
+    }
+
+    /**
+     * Advanced builder for endpoint for the Spark component.
+     */
+    public interface AdvancedSparkEndpointBuilder
+            extends
+                EndpointProducerBuilder {
+        default SparkEndpointBuilder basic() {
+            return (SparkEndpointBuilder) this;
+        }
         /**
          * Whether the endpoint should use basic property binding (Camel 2.x) or
          * the newer property binding with additional capabilities.
@@ -317,56 +232,6 @@ public interface SparkEndpointBuilderFactory {
             return this;
         }
         /**
-         * Whether or not the consumer should try to find a target consumer by
-         * matching the URI prefix if no exact match is found.
-         * 
-         * The option is a: <code>boolean</code> type.
-         * 
-         * Group: advanced
-         */
-        default AdvancedSparkEndpointBuilder matchOnUriPrefix(
-                boolean matchOnUriPrefix) {
-            doSetProperty("matchOnUriPrefix", matchOnUriPrefix);
-            return this;
-        }
-        /**
-         * Whether or not the consumer should try to find a target consumer by
-         * matching the URI prefix if no exact match is found.
-         * 
-         * The option will be converted to a <code>boolean</code> type.
-         * 
-         * Group: advanced
-         */
-        default AdvancedSparkEndpointBuilder matchOnUriPrefix(
-                String matchOnUriPrefix) {
-            doSetProperty("matchOnUriPrefix", matchOnUriPrefix);
-            return this;
-        }
-        /**
-         * To use a custom SparkBinding to map to/from Camel message.
-         * 
-         * The option is a:
-         * <code>org.apache.camel.component.sparkrest.SparkBinding</code> type.
-         * 
-         * Group: advanced
-         */
-        default AdvancedSparkEndpointBuilder sparkBinding(Object sparkBinding) {
-            doSetProperty("sparkBinding", sparkBinding);
-            return this;
-        }
-        /**
-         * To use a custom SparkBinding to map to/from Camel message.
-         * 
-         * The option will be converted to a
-         * <code>org.apache.camel.component.sparkrest.SparkBinding</code> type.
-         * 
-         * Group: advanced
-         */
-        default AdvancedSparkEndpointBuilder sparkBinding(String sparkBinding) {
-            doSetProperty("sparkBinding", sparkBinding);
-            return this;
-        }
-        /**
          * Sets whether synchronous processing should be strictly used, or Camel
          * is allowed to use asynchronous processing (if supported).
          * 
@@ -392,28 +257,24 @@ public interface SparkEndpointBuilderFactory {
         }
     }
     /**
-     * Spark Rest (camel-spark-rest)
-     * The spark-rest component is used for hosting REST services which has been
-     * defined using Camel rest-dsl.
-     * 
-     * Category: rest
-     * Since: 2.14
-     * Maven coordinates: org.apache.camel:camel-spark-rest
+     * Spark (camel-spark)
+     * The spark component can be used to send RDD or DataFrame jobs to Apache
+     * Spark cluster.
      * 
-     * Syntax: <code>spark-rest:verb:path</code>
+     * Category: bigdata,iot
+     * Since: 2.17
+     * Maven coordinates: org.apache.camel:camel-spark
      * 
-     * Path parameter: verb (required)
-     * get, post, put, patch, delete, head, trace, connect, or options.
-     * The value can be one of: get, post, put, patch, delete, head, trace,
-     * connect, options
+     * Syntax: <code>spark:endpointType</code>
      * 
-     * Path parameter: path (required)
-     * The content path which support Spark syntax.
+     * Path parameter: endpointType (required)
+     * Type of the endpoint (rdd, dataframe, hive).
+     * The value can be one of: rdd, dataframe, hive
      */
-    default SparkEndpointBuilder sparkRest(String path) {
+    default SparkEndpointBuilder spark(String path) {
         class SparkEndpointBuilderImpl extends AbstractEndpointBuilder implements SparkEndpointBuilder, AdvancedSparkEndpointBuilder {
             public SparkEndpointBuilderImpl(String path) {
-                super("spark-rest", path);
+                super("spark", path);
             }
         }
         return new SparkEndpointBuilderImpl(path);
diff --git a/docs/components/modules/ROOT/pages/nagios-component.adoc b/docs/components/modules/ROOT/pages/nagios-component.adoc
index d1ae2e9..4abb483 100644
--- a/docs/components/modules/ROOT/pages/nagios-component.adoc
+++ b/docs/components/modules/ROOT/pages/nagios-component.adoc
@@ -116,7 +116,7 @@ When using Spring Boot make sure to use the following Maven dependency to have s
 ----
 
 
-The component supports 12 options, which are listed below.
+The component supports 11 options, which are listed below.
 
 
 
@@ -134,7 +134,6 @@ The component supports 12 options, which are listed below.
 | *camel.component.nagios.configuration.timeout* | Sending timeout in millis. | 5000 | Integer
 | *camel.component.nagios.enabled* | Whether to enable auto configuration of the nagios component. This is enabled by default. |  | Boolean
 | *camel.component.nagios.lazy-start-producer* | Whether the producer should be started lazy (on the first message). By starting lazy you can use this to allow CamelContext and routes to startup in situations where a producer may otherwise fail during starting and cause the route to fail being started. By deferring this startup to be lazy then the startup failure can be handled during routing messages via Camel's routing error handlers. Beware that when the first message is processed the [...]
-| *camel.component.nagios.configuration.encryption-method* | *Deprecated* To specify an encryption method. |  | NagiosEncryptionMethod
 |===
 // spring-boot-auto-configure options: END
 
diff --git a/platforms/spring-boot/components-starter/camel-ganglia-starter/src/main/java/org/apache/camel/component/ganglia/springboot/GangliaComponentConfiguration.java b/platforms/spring-boot/components-starter/camel-ganglia-starter/src/main/java/org/apache/camel/component/ganglia/springboot/GangliaComponentConfiguration.java
index ce4a199..749b076 100644
--- a/platforms/spring-boot/components-starter/camel-ganglia-starter/src/main/java/org/apache/camel/component/ganglia/springboot/GangliaComponentConfiguration.java
+++ b/platforms/spring-boot/components-starter/camel-ganglia-starter/src/main/java/org/apache/camel/component/ganglia/springboot/GangliaComponentConfiguration.java
@@ -145,7 +145,7 @@ public class GangliaComponentConfiguration
         /**
          * The type of value
          */
-        private GMetricType type = GMetricType.string;
+        private GMetricType type = GMetricType.STRING;
         /**
          * The slope
          */