Posted to commits@camel.apache.org by GitBox <gi...@apache.org> on 2020/09/07 09:59:36 UTC

[GitHub] [camel-kafka-connector] apupier opened a new issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

apupier opened a new issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430


   In addition to the Camel main catalog https://github.com/apache/camel/blob/camel-3.5.0/catalog/camel-catalog/src/generated/resources/org/apache/camel/catalog/main/camel-main-configuration-metadata.json , there are configuration properties specific to Camel Kafka Connector, for instance `camel.source.*` and `camel.sink.*` (are there others?)
   
   Providing a catalog would allow the Camel Language Server to provide completion and hover.
   
   Note: a second step could be to provide validation too.
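
   For illustration, a connector configuration mixes standard Kafka Connect options with Camel-prefixed ones. A minimal sketch (the camel.* names below come from the aws2-s3 sink options listed later in this thread; name, connector.class and topics are standard Kafka Connect settings; none of this is taken from an actual example file):

   ```
   # standard Kafka Connect options (not Camel specific)
   name=my-s3-sink
   connector.class=org.apache.camel.kafkaconnector.aws2s3.CamelAws2s3SinkConnector
   topics=mytopic

   # Camel Kafka Connector prefixed options, mapped back to the Camel catalog
   camel.sink.path.bucketNameOrArn=mybucket
   camel.sink.endpoint.autoCreateBucket=true
   camel.component.aws2-s3.accessKey=xxx
   camel.component.aws2-s3.secretKey=xxx
   ```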





[GitHub] [camel-kafka-connector] apupier edited a comment on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier edited a comment on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690058374


   > The camel component/endpoint options are exactly the same of the camel main catalog plus prefixes.
   
   Can you specify which prefixes and point to an example?
   Based on the examples I found, it sounds exactly the same as the normal catalog, i.e. written the same way as a Properties file for Camel main; for instance [camel.component.aws-kinesis.configuration.access-key=youraccesskey](https://github.com/apache/camel-kafka-connector/blob/15f7fb1f40f92ed66f1cbbfd62bc6521079afd41/examples/CamelAWSKinesisSourceConnector.properties#L24) doesn't contain anything specific to Camel Kafka Connector.
   
   > what you see in CamelSourceConnectorConfig and CamelSinkConnectorConfig have been defined without taking into account the catalog content.
   
   Do you mean that these configurations should be modified to reuse existing catalog content?
   I thought these were properties specific to Camel Kafka Connector, so having a catalog for what is defined inside these 2 classes would help provide completion.





[GitHub] [camel-kafka-connector] apupier commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690071753


   > https://camel.apache.org/camel-kafka-connector/latest/connectors/camel-activemq-kafka-sink-connector.html
   > 
   > All the camel.sink.* stuff or camel.source.* stuff are camel catalog options renamed with sink or source in the name
   
   ok, I just discovered the other options under camel.sink.* and camel.source.* (apart from camel.source.url and camel.sink.url).
   Does it mean that the potential properties depend on the value of connector.class that has been provided? How do we map the connector.class value to the corresponding component in the normal Camel Catalog, which is grouped by component id? Can camel.sink.url and the list of individual properties be used at the same time?
   (might be easier to continue this part of the conversation here https://issues.redhat.com/browse/FUSETOOLS2-670 )
   
   > For the second point, no, they don't need to be renamed. It was just to explain the only options not generated from catalog
   
   If I understand correctly, these are really specific to Camel Kafka Connector AND cannot be derived from the normal catalog. Providing a catalog for them will help.





[GitHub] [camel-kafka-connector] oscerd commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
oscerd commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-704532287


   I'm working on this. #515 is related too.
   
   We'll provide camel-kafka-connector options and metadata. Kafka Connect options are out of scope since they come from a different library, so they won't be part of the catalog.





[GitHub] [camel-kafka-connector] orpiske edited a comment on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
orpiske edited a comment on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690027290


   > additionally to Camel main catalog https://github.com/apache/camel/blob/camel-3.5.0/catalog/camel-catalog/src/generated/resources/org/apache/camel/catalog/main/camel-main-configuration-metadata.json , there are specific Camel Kafka Connector configuration properties, for instance `camel.source.*` and `camel.sink.*` (are there others?)
   
   Sounds interesting! I think we would also need to provide the catalog for the Kafka Connect properties, right? Otherwise it would only complete the Camel-specific ones, but not the Kafka Connect ones we also need to use. 
   
   > 
   > providing a catalog would allow to have completion and hover provided by the Camel Language Server, see [FUSETOOLS2-626](https://issues.redhat.com/browse/FUSETOOLS2-626)
   > 
   > nota: a second step could be to provide validation too
   
   





[GitHub] [camel-kafka-connector] apupier commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-692693787


   > To summarize to see if I have understood correctly, you are proposing to create a new camel catalog specific for camel-kafka-connector containing all the information you mentioned?
   > 
   > As has been said that would be more or less a mapping of the original camel catalog with some prefixes for now... at the moment the other info you are listing (converters, transformers, aggregation strategies) are not present.
   
   not exactly, I'm proposing to create the catalog for everything that is not coming from the original Camel catalog.
   Currently, we cannot provide anything for Camel Kafka Connector (apart from hardcoding it, but I really don't think it is a good idea to follow this path). Even for the part that can be "mapped from the original camel catalog", there is missing information in the Properties file to know which Camel connector id needs to be used. See https://issues.redhat.com/browse/FUSETOOLS2-670
   
   > It remains the problem relative to the properties coming directly from kafka connect, did you investigate if there is a way to gathering those from somewhere in kafka?
   
   no. I'm awaiting feedback from the Developer Tooling team, which should come with plans for Kafka tooling in general. I hope it will include the Kafka / Kafka Connect properties (but I have absolutely no visibility on that so far; they just told me to wait).
   This is a separate problem from this issue anyway.





[GitHub] [camel-kafka-connector] janstey commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
janstey commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-700279463


   @oscerd @valdar would it be a big job to create this catalog that @apupier is requesting? I think we will need tooling for CKC sooner rather than later :-)





[GitHub] [camel-kafka-connector] apupier commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-705500743


   > @apupier and @lhein can you provide some information about the way you want to consume these metadata and options? So we can provide an API eventually and later.
   
   What I have in mind is:
   * declare a dependency on the "Camel Kafka Connector Catalog" artifact that provides the catalog
   * the Camel Language Server is written in Java. The ideal would be an API method returning the catalog as a Java model. It could also be an API method returning the JSON as a string, with the Camel Language Server defining the Java model in our codebase. Or the Camel Language Server could retrieve the resources in the jar and define the Java model in our codebase.
   
   Something even better would be the ability to reuse the Camel Catalog directly, which would allow using different versions (but that sounds like a lot more work and I do not know how feasible it is).
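
   To illustrate the last option (retrieving the resources from the jar), a minimal sketch; the resource path and file name used here are assumptions for illustration, not an actual layout or API:

   ```
   import java.io.IOException;
   import java.io.InputStream;
   import java.nio.charset.StandardCharsets;

   public class CatalogResourceLoader {

       // Reads a raw JSON descriptor bundled inside the camel-kafka-connector-catalog jar.
       // The "descriptors/connectors/" resource path is an assumption for illustration only.
       public static String loadDescriptor(String connectorId) throws IOException {
           String path = "/descriptors/connectors/" + connectorId + ".json";
           try (InputStream is = CatalogResourceLoader.class.getResourceAsStream(path)) {
               if (is == null) {
                   throw new IOException("No descriptor found for " + connectorId);
               }
               return new String(is.readAllBytes(), StandardCharsets.UTF_8);
           }
       }
   }
   ```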





[GitHub] [camel-kafka-connector] apupier commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-705513426


   > Do you mean listing all the connectors artifacts as deps in the camel kafka connector catalog?
   
   no.
   
   
   I imagine having, in the camel-language-server pom, a dependency on the catalog, something like:
   ```
   <dependency>
     <groupId>org.apache.camel.kafkaconnector</groupId>
     <artifactId>camel-kafka-connector-catalog</artifactId>
   </dependency>
   ```
   
   and then, in Camel Language Server code, having an API method which returns the loaded model as a Java class, something like:
   
   ```
   CamelKafkaConnectorCatalog catalog = new CamelKafkaConnectorCatalog();
   List<CamelKafkaConnectorModel> connectors = catalog.getConnectors();
   
   ```
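
   For example, completion for a given connector.class could then be derived from that model along these lines (a sketch; only getConnectors() comes from the snippet above, the other accessors are hypothetical):

   ```
   // Hypothetical helper: find the connector matching the connector.class value typed by
   // the user and propose its property names as completion items.
   public static List<String> propertyNamesFor(CamelKafkaConnectorCatalog catalog, String connectorClass) {
       return catalog.getConnectors().stream()
               .filter(model -> connectorClass.equals(model.getConnectorClass())) // hypothetical accessor
               .flatMap(model -> model.getProperties().stream())                  // hypothetical accessor
               .map(property -> property.getName())                               // hypothetical accessor
               .collect(java.util.stream.Collectors.toList());
   }
   ```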





[GitHub] [camel-kafka-connector] oscerd commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
oscerd commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690032764


   In reality, from the camel side we are just reusing the original Camel catalog and adding the sink and source prefixes. So I don't know how much sense it makes to repackage another catalog just to add the same set of information with prefixes in the name.





[GitHub] [camel-kafka-connector] orpiske commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
orpiske commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690027290


   > additionally to Camel main catalog https://github.com/apache/camel/blob/camel-3.5.0/catalog/camel-catalog/src/generated/resources/org/apache/camel/catalog/main/camel-main-configuration-metadata.json , there are specific Camel Kafka Connector configuration properties, for instance `camel.source.*` and `camel.sink.*` (are there others?)
   
   Looks interesting! I think we would also need to provide the catalog for the Kafka Connect properties, right? Otherwise it would only complete the Camel-specific ones, but not the Kafka Connect ones we also need to use.
   
   > 
   > providing a catalog would allow to have completion and hover provided by the Camel Language Server, see [FUSETOOLS2-626](https://issues.redhat.com/browse/FUSETOOLS2-626)
   > 
   > nota: a second step could be to provide validation too
   
   





[GitHub] [camel-kafka-connector] oscerd commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
oscerd commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-700423446


   It's for sure some work.





[GitHub] [camel-kafka-connector] oscerd commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
oscerd commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-705058568


   Once #536 is merged, we'll have a list of connectors in connectors.properties, under the descriptors folder of the catalog JAR:
   
   ```
   camel-activemq-source
   camel-activemq-sink
   camel-ahc-sink
   camel-ahc-ws-source
   camel-ahc-ws-sink
   camel-ahc-wss-source
   camel-ahc-wss-sink
   camel-amqp-source
   camel-amqp-sink
   camel-apns-source
   camel-apns-sink
   camel-arangodb-sink
   camel-as2-source
   camel-as2-sink
   camel-asterisk-source
   camel-asterisk-sink
   camel-atmos-source
   ...
   ..
   ..
   ```
   
   and in the connectors folder we'll have a JSON file for each source/sink connector with the following structure:
   
   ```
   {
   	"connector": {
   		"class": "org.apache.camel.kafkaconnector.aws2s3.CamelAws2s3SinkConnector",
   		"artifactId": "camel-aws2-s3-kafka-connector",
   		"groupId": "org.apache.camel.kafkaconnector",
   		"id": "camel-aws2-s3-sink",
   		"type": "sink",
   		"version": "0.6.0-SNAPSHOT"
   	},
   	"properties": {
   		"camel.sink.path.bucketNameOrArn": {
   			"name": "camel.sink.path.bucketNameOrArn",
   			"description": "Bucket name or ARN",
   			"defaultValue": "null",
   			"priority": "HIGH"
   		},
   		"camel.sink.endpoint.amazonS3Client": {
   			"name": "camel.sink.endpoint.amazonS3Client",
   			"description": "Reference to a com.amazonaws.services.s3.AmazonS3 in the registry.",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.autoCreateBucket": {
   			"name": "camel.sink.endpoint.autoCreateBucket",
   			"description": "Setting the autocreation of the S3 bucket bucketName. This will apply also in case of moveAfterRead option enabled and it will create the destinationBucket if it doesn't exist already.",
   			"defaultValue": "true",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.autoDiscoverClient": {
   			"name": "camel.sink.endpoint.autoDiscoverClient",
   			"description": "Setting the autoDiscoverClient mechanism, if true, the component will look for a client instance in the registry automatically otherwise it will skip that checking.",
   			"defaultValue": "true",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.overrideEndpoint": {
   			"name": "camel.sink.endpoint.overrideEndpoint",
   			"description": "Set the need for overidding the endpoint. This option needs to be used in combination with uriEndpointOverride option",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.pojoRequest": {
   			"name": "camel.sink.endpoint.pojoRequest",
   			"description": "If we want to use a POJO request as body or not",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.policy": {
   			"name": "camel.sink.endpoint.policy",
   			"description": "The policy for this queue to set in the com.amazonaws.services.s3.AmazonS3#setBucketPolicy() method.",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.proxyHost": {
   			"name": "camel.sink.endpoint.proxyHost",
   			"description": "To define a proxy host when instantiating the SQS client",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.proxyPort": {
   			"name": "camel.sink.endpoint.proxyPort",
   			"description": "Specify a proxy port to be used inside the client definition.",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.proxyProtocol": {
   			"name": "camel.sink.endpoint.proxyProtocol",
   			"description": "To define a proxy protocol when instantiating the S3 client One of: [HTTP] [HTTPS]",
   			"defaultValue": "\"HTTPS\"",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.region": {
   			"name": "camel.sink.endpoint.region",
   			"description": "The region in which S3 client needs to work. When using this parameter, the configuration will expect the lowercase name of the region (for example ap-east-1) You'll need to use the name Region.EU_WEST_1.id()",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.trustAllCertificates": {
   			"name": "camel.sink.endpoint.trustAllCertificates",
   			"description": "If we want to trust all certificates in case of overriding the endpoint",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.uriEndpointOverride": {
   			"name": "camel.sink.endpoint.uriEndpointOverride",
   			"description": "Set the overriding uri endpoint. This option needs to be used in combination with overrideEndpoint option",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.useIAMCredentials": {
   			"name": "camel.sink.endpoint.useIAMCredentials",
   			"description": "Set whether the S3 client should expect to load credentials on an EC2 instance or to expect static credentials to be passed in.",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.customerAlgorithm": {
   			"name": "camel.sink.endpoint.customerAlgorithm",
   			"description": "Define the customer algorithm to use in case CustomerKey is enabled",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.customerKeyId": {
   			"name": "camel.sink.endpoint.customerKeyId",
   			"description": "Define the id of Customer key to use in case CustomerKey is enabled",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.customerKeyMD5": {
   			"name": "camel.sink.endpoint.customerKeyMD5",
   			"description": "Define the MD5 of Customer key to use in case CustomerKey is enabled",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.deleteAfterWrite": {
   			"name": "camel.sink.endpoint.deleteAfterWrite",
   			"description": "Delete file object after the S3 file has been uploaded",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.keyName": {
   			"name": "camel.sink.endpoint.keyName",
   			"description": "Setting the key name for an element in the bucket through endpoint parameter",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.lazyStartProducer": {
   			"name": "camel.sink.endpoint.lazyStartProducer",
   			"description": "Whether the producer should be started lazy (on the first message). By starting lazy you can use this to allow CamelContext and routes to startup in situations where a producer may otherwise fail during starting and cause the route to fail being started. By deferring this startup to be lazy then the startup failure can be handled during routing messages via Camel's routing error handlers. Beware that when the first message is processed then creating and starting the producer may take a little time and prolong the total processing time of the processing.",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.multiPartUpload": {
   			"name": "camel.sink.endpoint.multiPartUpload",
   			"description": "If it is true, camel will upload the file with multi part format, the part size is decided by the option of partSize",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.operation": {
   			"name": "camel.sink.endpoint.operation",
   			"description": "The operation to do in case the user don't want to do only an upload One of: [copyObject] [listObjects] [deleteObject] [deleteBucket] [listBuckets] [getObject] [getObjectRange]",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.partSize": {
   			"name": "camel.sink.endpoint.partSize",
   			"description": "Setup the partSize which is used in multi part upload, the default size is 25M.",
   			"defaultValue": "26214400L",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.storageClass": {
   			"name": "camel.sink.endpoint.storageClass",
   			"description": "The storage class to set in the com.amazonaws.services.s3.model.PutObjectRequest request.",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.awsKMSKeyId": {
   			"name": "camel.sink.endpoint.awsKMSKeyId",
   			"description": "Define the id of KMS key to use in case KMS is enabled",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.useAwsKMS": {
   			"name": "camel.sink.endpoint.useAwsKMS",
   			"description": "Define if KMS must be used or not",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.useCustomerKey": {
   			"name": "camel.sink.endpoint.useCustomerKey",
   			"description": "Define if Customer Key must be used or not",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.basicPropertyBinding": {
   			"name": "camel.sink.endpoint.basicPropertyBinding",
   			"description": "Whether the endpoint should use basic property binding (Camel 2.x) or the newer property binding with additional capabilities",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.synchronous": {
   			"name": "camel.sink.endpoint.synchronous",
   			"description": "Sets whether synchronous processing should be strictly used, or Camel is allowed to use asynchronous processing (if supported).",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.accessKey": {
   			"name": "camel.sink.endpoint.accessKey",
   			"description": "Amazon AWS Access Key",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.sink.endpoint.secretKey": {
   			"name": "camel.sink.endpoint.secretKey",
   			"description": "Amazon AWS Secret Key",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.amazonS3Client": {
   			"name": "camel.component.aws2-s3.amazonS3Client",
   			"description": "Reference to a com.amazonaws.services.s3.AmazonS3 in the registry.",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.autoCreateBucket": {
   			"name": "camel.component.aws2-s3.autoCreateBucket",
   			"description": "Setting the autocreation of the S3 bucket bucketName. This will apply also in case of moveAfterRead option enabled and it will create the destinationBucket if it doesn't exist already.",
   			"defaultValue": "true",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.autoDiscoverClient": {
   			"name": "camel.component.aws2-s3.autoDiscoverClient",
   			"description": "Setting the autoDiscoverClient mechanism, if true, the component will look for a client instance in the registry automatically otherwise it will skip that checking.",
   			"defaultValue": "true",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.configuration": {
   			"name": "camel.component.aws2-s3.configuration",
   			"description": "The component configuration",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.overrideEndpoint": {
   			"name": "camel.component.aws2-s3.overrideEndpoint",
   			"description": "Set the need for overidding the endpoint. This option needs to be used in combination with uriEndpointOverride option",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.pojoRequest": {
   			"name": "camel.component.aws2-s3.pojoRequest",
   			"description": "If we want to use a POJO request as body or not",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.policy": {
   			"name": "camel.component.aws2-s3.policy",
   			"description": "The policy for this queue to set in the com.amazonaws.services.s3.AmazonS3#setBucketPolicy() method.",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.proxyHost": {
   			"name": "camel.component.aws2-s3.proxyHost",
   			"description": "To define a proxy host when instantiating the SQS client",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.proxyPort": {
   			"name": "camel.component.aws2-s3.proxyPort",
   			"description": "Specify a proxy port to be used inside the client definition.",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.proxyProtocol": {
   			"name": "camel.component.aws2-s3.proxyProtocol",
   			"description": "To define a proxy protocol when instantiating the S3 client One of: [HTTP] [HTTPS]",
   			"defaultValue": "\"HTTPS\"",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.region": {
   			"name": "camel.component.aws2-s3.region",
   			"description": "The region in which S3 client needs to work. When using this parameter, the configuration will expect the lowercase name of the region (for example ap-east-1) You'll need to use the name Region.EU_WEST_1.id()",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.trustAllCertificates": {
   			"name": "camel.component.aws2-s3.trustAllCertificates",
   			"description": "If we want to trust all certificates in case of overriding the endpoint",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.uriEndpointOverride": {
   			"name": "camel.component.aws2-s3.uriEndpointOverride",
   			"description": "Set the overriding uri endpoint. This option needs to be used in combination with overrideEndpoint option",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.useIAMCredentials": {
   			"name": "camel.component.aws2-s3.useIAMCredentials",
   			"description": "Set whether the S3 client should expect to load credentials on an EC2 instance or to expect static credentials to be passed in.",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.customerAlgorithm": {
   			"name": "camel.component.aws2-s3.customerAlgorithm",
   			"description": "Define the customer algorithm to use in case CustomerKey is enabled",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.customerKeyId": {
   			"name": "camel.component.aws2-s3.customerKeyId",
   			"description": "Define the id of Customer key to use in case CustomerKey is enabled",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.customerKeyMD5": {
   			"name": "camel.component.aws2-s3.customerKeyMD5",
   			"description": "Define the MD5 of Customer key to use in case CustomerKey is enabled",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.deleteAfterWrite": {
   			"name": "camel.component.aws2-s3.deleteAfterWrite",
   			"description": "Delete file object after the S3 file has been uploaded",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.keyName": {
   			"name": "camel.component.aws2-s3.keyName",
   			"description": "Setting the key name for an element in the bucket through endpoint parameter",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.lazyStartProducer": {
   			"name": "camel.component.aws2-s3.lazyStartProducer",
   			"description": "Whether the producer should be started lazy (on the first message). By starting lazy you can use this to allow CamelContext and routes to startup in situations where a producer may otherwise fail during starting and cause the route to fail being started. By deferring this startup to be lazy then the startup failure can be handled during routing messages via Camel's routing error handlers. Beware that when the first message is processed then creating and starting the producer may take a little time and prolong the total processing time of the processing.",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.multiPartUpload": {
   			"name": "camel.component.aws2-s3.multiPartUpload",
   			"description": "If it is true, camel will upload the file with multi part format, the part size is decided by the option of partSize",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.operation": {
   			"name": "camel.component.aws2-s3.operation",
   			"description": "The operation to do in case the user don't want to do only an upload One of: [copyObject] [listObjects] [deleteObject] [deleteBucket] [listBuckets] [getObject] [getObjectRange]",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.partSize": {
   			"name": "camel.component.aws2-s3.partSize",
   			"description": "Setup the partSize which is used in multi part upload, the default size is 25M.",
   			"defaultValue": "26214400L",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.storageClass": {
   			"name": "camel.component.aws2-s3.storageClass",
   			"description": "The storage class to set in the com.amazonaws.services.s3.model.PutObjectRequest request.",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.awsKMSKeyId": {
   			"name": "camel.component.aws2-s3.awsKMSKeyId",
   			"description": "Define the id of KMS key to use in case KMS is enabled",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.useAwsKMS": {
   			"name": "camel.component.aws2-s3.useAwsKMS",
   			"description": "Define if KMS must be used or not",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.useCustomerKey": {
   			"name": "camel.component.aws2-s3.useCustomerKey",
   			"description": "Define if Customer Key must be used or not",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.basicPropertyBinding": {
   			"name": "camel.component.aws2-s3.basicPropertyBinding",
   			"description": "Whether the component should use basic property binding (Camel 2.x) or the newer property binding with additional capabilities",
   			"defaultValue": "false",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.accessKey": {
   			"name": "camel.component.aws2-s3.accessKey",
   			"description": "Amazon AWS Access Key",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		},
   		"camel.component.aws2-s3.secretKey": {
   			"name": "camel.component.aws2-s3.secretKey",
   			"description": "Amazon AWS Secret Key",
   			"defaultValue": "null",
   			"priority": "MEDIUM"
   		}
   	}
   }
   ```
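
   As a rough illustration of how a consumer could read one of these descriptors, a minimal model might look like this (a sketch assuming Jackson is on the classpath; the class names are made up and this is not the catalog API):

   ```
   import java.io.IOException;
   import java.util.Map;

   import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
   import com.fasterxml.jackson.annotation.JsonProperty;
   import com.fasterxml.jackson.databind.ObjectMapper;

   public class DescriptorReader {

       @JsonIgnoreProperties(ignoreUnknown = true)
       public static class Descriptor {
           public ConnectorInfo connector;
           public Map<String, PropertyInfo> properties;
       }

       @JsonIgnoreProperties(ignoreUnknown = true)
       public static class ConnectorInfo {
           @JsonProperty("class")
           public String connectorClass;
           public String artifactId;
           public String groupId;
           public String id;
           public String type;
           public String version;
       }

       @JsonIgnoreProperties(ignoreUnknown = true)
       public static class PropertyInfo {
           public String name;
           public String description;
           public String defaultValue;
           public String priority;
       }

       // Parses one per-connector JSON descriptor with the structure shown above.
       public static Descriptor read(String json) throws IOException {
           return new ObjectMapper().readValue(json, Descriptor.class);
       }
   }
   ```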
   
   @apupier and @lhein can you provide some information about the way you want to consume these metadata and options, so that we can eventually provide an API?





[GitHub] [camel-kafka-connector] apupier commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690035293


   > In reality from the camel side we are just reusing the original Camel catalog and add the sink and source prefixes. So I don't know how much it makes sense to repackage another catalog just for adding the same set of information with prefixes in the name.
   
   that would be easier.
   
   For instance, where should `maxPollDuration` be taken from?
   https://github.com/apache/camel-kafka-connector/blob/15f7fb1f40f92ed66f1cbbfd62bc6521079afd41/examples/CamelAWSS3SourceConnector.properties#L25
   I don't find it in the Camel main catalog.
   
   Aren't all the properties here Camel Kafka Connector specific? https://github.com/apache/camel-kafka-connector/blob/81edd4b9a424fe4a06e6719e6d08e037c87c08c4/core/src/main/java/org/apache/camel/kafkaconnector/CamelSourceConnectorConfig.java





[GitHub] [camel-kafka-connector] orpiske edited a comment on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
orpiske edited a comment on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690027290


   > additionally to Camel main catalog https://github.com/apache/camel/blob/camel-3.5.0/catalog/camel-catalog/src/generated/resources/org/apache/camel/catalog/main/camel-main-configuration-metadata.json , there are specific Camel Kafka Connector configuration properties, for instance `camel.source.*` and `camel.sink.*` (are there others?)
   
   Sounds interesting! I think we would also need to provide the catalog for the Kafka Connect properties, right? Otherwise it would only complete the Camel-specific ones, but not the Kafka Connect ones we also need to use.
   
   > 
   > providing a catalog would allow to have completion and hover provided by the Camel Language Server, see [FUSETOOLS2-626](https://issues.redhat.com/browse/FUSETOOLS2-626)
   > 
   > nota: a second step could be to provide validation too
   
   





[GitHub] [camel-kafka-connector] oscerd commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
oscerd commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-705504576









[GitHub] [camel-kafka-connector] oscerd commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
oscerd commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690041327


   There is a mixture of kafka-connect specific options, camel options, and camel component/endpoint options.
   
   The camel component/endpoint options are exactly the same as in the camel main catalog, plus prefixes. maxPollDuration is a specific option of the polling consumer and it's named that way in ckc. The options in the sink and source connectors (generic classes) are not generated, so what you see in CamelSourceConnectorConfig and CamelSinkConnectorConfig has been defined without taking the catalog content into account.





[GitHub] [camel-kafka-connector] oscerd closed issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
oscerd closed issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430


   





[GitHub] [camel-kafka-connector] oscerd commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
oscerd commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690062511


   https://camel.apache.org/camel-kafka-connector/latest/connectors/camel-activemq-kafka-sink-connector.html
   
   All the camel.sink.* and camel.source.* options are camel catalog options renamed with sink or source in the name.
   
   For the second point, no, they don't need to be renamed. It was just to explain the only options not generated from the catalog.
   
   





[GitHub] [camel-kafka-connector] apupier commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690032856


   > I think we would also need to provide the catalog for the Kafka Connect properties, right? Otherwise it would only complete the Camel-specific ones, but not the Kafka Connect ones we also need to use.
   
   I would expect Kafka tooling, not Camel-specific tooling, to provide it. But if it does not exist, we can also work on that.





[GitHub] [camel-kafka-connector] apupier edited a comment on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier edited a comment on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690071753


   > https://camel.apache.org/camel-kafka-connector/latest/connectors/camel-activemq-kafka-sink-connector.html
   > 
   > All the camel.sink.* stuff or camel.source.* stuff are camel catalog options renamed with sink or source in the name
   
   ok, I just discovered the other options under camel.sink.* and camel.source.* (apart from camel.source.url and camel.sink.url).
   Does it mean that the potential properties depend on the value of connector.class that has been provided? How do we map the connector.class value to the corresponding component in the normal Camel Catalog, which is grouped by component id? Can camel.sink.url and the list of individual properties be used at the same time?
   (might be easier to continue this part of the conversation here https://issues.redhat.com/browse/FUSETOOLS2-670 )
   
   > For the second point, no, they don't need to be renamed. It was just to explain the only options not generated from catalog
   
   If I understand correctly, these are really specific to Camel Kafka Connector AND cannot be derived from the normal catalog. Providing a catalog for them will help.





[GitHub] [camel-kafka-connector] valdar commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
valdar commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-692689499


   @apupier sorry to jump this late in the conversation.
   
   To summarize, to see if I have understood correctly: you are proposing to create a new camel catalog, specific to camel-kafka-connector, containing all the information you mentioned?
   
   As has been said, that would be more or less a mapping of the original camel catalog with some prefixes for now... at the moment the other info you are listing (converters, transformers, aggregation strategies) is not present.
   
   There remains the problem of the properties coming directly from kafka connect; did you investigate whether there is a way to gather those from somewhere in kafka?





[GitHub] [camel-kafka-connector] apupier edited a comment on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier edited a comment on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690058374


   > The camel component/endpoint options are exactly the same of the camel main catalog plus prefixes.
   
   Can you precise which prefixes and points to an example?
   Based on examples I found, it sounds like exactly the same than normal catalog, so writing the same than for Properties file for Camel main, for instance [camel.component.amqp.includeAmqpAnnotations=true](https://github.com/apache/camel-kafka-connector/blob/15f7fb1f40f92ed66f1cbbfd62bc6521079afd41/examples/CamelAmqpSourceConnector.properties#L26) doesn't contain anything specific to Camel Kafka Connector.
   
   > what you see in CamelSourceConnectorConfig and CamelSinkConnectorConfig have been defined without taking into account the catalog content.
   
   Do you mean that these configurations should be modified to reuse existing catalog content?
   I thought it was specific Camel Kafka Connector properties, so having a catalog for what is defined inside these 2 classes would help providing completion.





[GitHub] [camel-kafka-connector] apupier edited a comment on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier edited a comment on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-692693787


   > To summarize to see if I have understood correctly, you are proposing to create a new camel catalog specific for camel-kafka-connector containing all the information you mentioned?
   > 
   > As has been said that would be more or less a mapping of the original camel catalog with some prefixes for now... at the moment the other info you are listing (converters, transformers, aggregation strategies) are not present.
   
   not exactly, I'm proposing to create the catalog for everything that is not coming from the original Camel catalog.
   Currently, we cannot provide anything for Camel Kafka Connector (apart from hardcoding it, but I really don't think it is a good idea to follow this path). Even for the part that can be "mapped from the original camel catalog", there is missing information in the Properties file to know which Camel connector id needs to be used. See https://issues.redhat.com/browse/FUSETOOLS2-670
   
   > It remains the problem relative to the properties coming directly from kafka connect, did you investigate if there is a way to gathering those from somewhere in kafka?
   
   no. I'm awaiting feedback from the Developer Tooling team, which should come with plans for Kafka tooling in general. I hope it will include the Kafka / Kafka Connect properties (but I have absolutely no visibility on that so far; they just told me to wait).
   This is a separate problem from this issue anyway. EDIT: created https://issues.redhat.com/browse/FUSETOOLS2-699 to track it.





[GitHub] [camel-kafka-connector] apupier commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-690058374


   > The camel component/endpoint options are exactly the same of the camel main catalog plus prefixes.
   
   Can you specify which prefixes and point to an example?
   Based on the examples I found, it sounds exactly the same as the normal catalog, i.e. written the same way as a Properties file for Camel main; for instance [camel.component.aws-kinesis.configuration.access-key=<youraccesskey>](https://github.com/apache/camel-kafka-connector/blob/15f7fb1f40f92ed66f1cbbfd62bc6521079afd41/examples/CamelAWSKinesisSourceConnector.properties#L24) doesn't contain anything specific to Camel Kafka Connector.
   
   > what you see in CamelSourceConnectorConfig and CamelSinkConnectorConfig have been defined without taking into account the catalog content.
   
   Do you mean that these configurations should be modified to reuse existing catalog content?
   I thought these were properties specific to Camel Kafka Connector, so having a catalog for what is defined inside these 2 classes would help provide completion.








[GitHub] [camel-kafka-connector] oscerd commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
oscerd commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-692708095


   Personally I think repackaging a catalog could be an important improvement, but it's not critical currently. We have a lot of more crucial features to work on, so we can leave this open and restart the discussion once we have free cycles for this.





[GitHub] [camel-kafka-connector] apupier commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
apupier commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-691877418


   I think the catalog will be useful for several cases:
   - properties specific to Camel Kafka Connector, the ones that are in CamelSourceConnectorConfig and CamelSinkConnectorConfig. Can then be leveraged in the Camel Language Server: [FUSETOOLS2-626](https://issues.redhat.com/browse/FUSETOOLS2-626)
   - provide the mapping between connector.class and the Camel connector id, to be able to know which Camel connector catalog to use for camel.sink.* and camel.source.* properties. Can then be leveraged in the Camel Language Server: [FUSETOOLS2-670](https://issues.redhat.com/browse/FUSETOOLS2-670)
   - provide the available Converters. Can then be leveraged in the Camel Language Server: [FUSETOOLS2-686](https://issues.redhat.com/browse/FUSETOOLS2-686)
   - provide the available Transformers. Can then be leveraged in the Camel Language Server: [FUSETOOLS2-687](https://issues.redhat.com/browse/FUSETOOLS2-687)
   - provide the available Aggregation Strategies. Can then be leveraged in the Camel Language Server: [FUSETOOLS2-688](https://issues.redhat.com/browse/FUSETOOLS2-688)
   - have the list of supported Camel connectors. As there are some "new" connectors marked in [What's new 0.5.0](https://camel.apache.org/blog/2020/09/Camel-kafka-connector-050-Whatsnew/), I'm guessing that there is not a 100% match with the Camel main catalog.
   
   I think that most of them can be implemented incrementally and separately.





[GitHub] [camel-kafka-connector] oscerd commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
oscerd commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-714359780


   Completed for 0.6.0





[GitHub] [camel-kafka-connector] oscerd commented on issue #430: Provide a catalog of potential configuration properties specific to Camel Kafka Connector

Posted by GitBox <gi...@apache.org>.
oscerd commented on issue #430:
URL: https://github.com/apache/camel-kafka-connector/issues/430#issuecomment-705518294


   ok, I'll expose something like that. Thanks for the clarification.

