Posted to commits@camel.apache.org by "jaehyeon-kim (via GitHub)" <gi...@apache.org> on 2023/05/25 21:09:27 UTC

[GitHub] [camel-kafka-connector] jaehyeon-kim opened a new issue, #1532: [ERROR] DynamoDB sink connector - No type converter available

jaehyeon-kim opened a new issue, #1532:
URL: https://github.com/apache/camel-kafka-connector/issues/1532

   Hello,
   
   I see the following error after starting the DynamoDB sink connector.
   
   The root cause seems to be `Caused by: org.apache.camel.NoTypeConversionAvailableException: No type converter available to convert from type: java.util.HashMap to the required type: java.io.InputStream`.
   
   ```
   [2023-05-25 20:59:26,090] ERROR WorkerSinkTask{id=order-sink-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:190)
   org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
   	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:610)
   	at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:330)
   	at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:232)
   	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:201)
   	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
   	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:237)
   	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
   	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
   	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
   	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
   	at java.base/java.lang.Thread.run(Thread.java:829)
   Caused by: org.apache.kafka.connect.errors.ConnectException: Exchange delivery has failed!
   	at org.apache.camel.kafkaconnector.CamelSinkTask.put(CamelSinkTask.java:210)
   	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:582)
   	... 10 more
   Caused by: org.apache.camel.InvalidPayloadException: No body available of type: java.io.InputStream but has type: java.util.HashMap on: Message. Caused by: No type converter available to convert from type: java.util.HashMap to the required type: java.io.InputStream. Exchange[E65CCFAA9CDE458-0000000000000000]. Caused by: [org.apache.camel.NoTypeConversionAvailableException - No type converter available to convert from type: java.util.HashMap to the required type: java.io.InputStream]
   	at org.apache.camel.support.MessageSupport.getMandatoryBody(MessageSupport.java:125)
   	at org.apache.camel.support.processor.UnmarshalProcessor.process(UnmarshalProcessor.java:70)
   	at org.apache.camel.processor.errorhandler.RedeliveryErrorHandler$SimpleTask.run(RedeliveryErrorHandler.java:477)
   	at org.apache.camel.impl.engine.DefaultReactiveExecutor$Worker.schedule(DefaultReactiveExecutor.java:189)
   	at org.apache.camel.impl.engine.DefaultReactiveExecutor.scheduleMain(DefaultReactiveExecutor.java:61)
   	at org.apache.camel.processor.Pipeline.process(Pipeline.java:182)
   	at org.apache.camel.impl.engine.CamelInternalProcessor.process(CamelInternalProcessor.java:399)
   	at org.apache.camel.component.direct.DirectProducer.process(DirectProducer.java:96)
   	at org.apache.camel.impl.engine.SharedCamelInternalProcessor.process(SharedCamelInternalProcessor.java:214)
   	at org.apache.camel.impl.engine.SharedCamelInternalProcessor$1.process(SharedCamelInternalProcessor.java:111)
   	at org.apache.camel.impl.engine.DefaultAsyncProcessorAwaitManager.process(DefaultAsyncProcessorAwaitManager.java:83)
   	at org.apache.camel.impl.engine.SharedCamelInternalProcessor.process(SharedCamelInternalProcessor.java:108)
   	at org.apache.camel.support.cache.DefaultProducerCache.send(DefaultProducerCache.java:199)
   	at org.apache.camel.impl.engine.DefaultProducerTemplate.send(DefaultProducerTemplate.java:176)
   	at org.apache.camel.impl.engine.DefaultProducerTemplate.send(DefaultProducerTemplate.java:148)
   	at org.apache.camel.kafkaconnector.CamelSinkTask.put(CamelSinkTask.java:205)
   	... 11 more
   Caused by: org.apache.camel.NoTypeConversionAvailableException: No type converter available to convert from type: java.util.HashMap to the required type: java.io.InputStream
   	at org.apache.camel.impl.converter.CoreTypeConverterRegistry.mandatoryConvertTo(CoreTypeConverterRegistry.java:274)
   	at org.apache.camel.support.MessageSupport.getMandatoryBody(MessageSupport.java:123)
   	... 26 more
   ```
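   
   For what it's worth, here is a minimal standalone sketch of where the `java.util.HashMap` body appears to come from. This is just my own illustration (the class name is made up; it assumes `connect-json` on the classpath), not connector code: with `schemas.enable=false`, `JsonConverter` hands the sink task a schemaless Java object rather than raw bytes.
   
   ```
   import org.apache.kafka.connect.data.SchemaAndValue;
   import org.apache.kafka.connect.json.JsonConverter;
   
   import java.nio.charset.StandardCharsets;
   import java.util.Map;
   
   // Illustrative only: shows the shape of the value the converter produces.
   public class ConverterCheck {
       public static void main(String[] args) {
           JsonConverter converter = new JsonConverter();
           converter.configure(Map.of("schemas.enable", "false"), false); // false = value converter
   
           byte[] raw = ("{\"quantity\": 2, \"product_id\": \"0894570382\", "
                   + "\"order_id\": \"2eace6b6-898d-4ea9-88ac-fecebec30bed\"}")
                   .getBytes(StandardCharsets.UTF_8);
   
           SchemaAndValue record = converter.toConnectData("order", raw);
           System.out.println(record.value().getClass()); // a Map implementation, e.g. java.util.HashMap
       }
   }
   ```
   
   That seems to match the `java.util.HashMap` body reported in the stack trace above.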
   
   A sample topic message is shown below.
   
   key
   ```
   {
   	"order_id": "2eace6b6-898d-4ea9-88ac-fecebec30bed"
   }
   ```
   
   value
   ```
   {
   	"quantity": 2,
   	"product_id": "0894570382",
   	"order_id": "2eace6b6-898d-4ea9-88ac-fecebec30bed",
   	"customer_id": 183,
   	"customer_name": "Milda Nitzsche"
   }
   ```
   
   Also, the connector uses the following configuration.
   
   ```
   {
     "name": "order-sink",
     "config": {
       "connector.class": "org.apache.camel.kafkaconnector.awsddbsink.CamelAwsddbsinkSinkConnector",
       "tasks.max": "1",
       "key.converter": "org.apache.kafka.connect.json.JsonConverter",
       "key.converter.schemas.enable": false,
       "value.converter": "org.apache.kafka.connect.json.JsonConverter",
       "value.converter.schemas.enable": false,
       "topics": "order",
       "camel.kamelet.aws-ddb-sink.table": "orders-short",
       "camel.kamelet.aws-ddb-sink.region": "ap-southeast-2",
       "camel.kamelet.aws-ddb-sink.operation": "PutItem",
       "camel.kamelet.aws-ddb-sink.writeCapacity": 1,
       "camel.kamelet.aws-ddb-sink.useDefaultCredentialsProvider": false,
       "camel.kamelet.aws-ddb-sink.accessKey": "accessKey",
       "camel.kamelet.aws-ddb-sink.secretKey": "secretKey"
     }
   }
   ```
   
   Could you please advise how to fix this issue?
   
   Cheers,
   Jaehyeon


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@camel.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [camel-kafka-connector] jaehyeon-kim commented on issue #1532: [ERROR] DynamoDB sink connector - No type converter available

Posted by "jaehyeon-kim (via GitHub)" <gi...@apache.org>.
jaehyeon-kim commented on issue #1532:
URL: https://github.com/apache/camel-kafka-connector/issues/1532#issuecomment-1563839043

   I'm very new to Camel. I just found the marshal/unmarshal options mentioned in previous issues.
   
   Based on the following error message, I guess I have to set an option like `camel.sink.marshal=json-jackson`?
   
   ```
   Caused by: org.apache.camel.NoTypeConversionAvailableException: No type converter available to convert from type: java.util.HashMap to the required type: java.io.InputStream
   ```
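   
   To check my understanding of the naming (a toy sketch of plain Camel with the Jackson data format, not the Kamelet's actual route): `marshal()` turns a Java object such as a `Map` into JSON text, and `unmarshal()` parses JSON text back into a Java object.
   
   ```
   import org.apache.camel.builder.RouteBuilder;
   import org.apache.camel.impl.DefaultCamelContext;
   import org.apache.camel.model.dataformat.JsonLibrary;
   
   import java.util.Map;
   
   // Toy example only; needs camel-core and camel-jackson on the classpath.
   public class MarshalVsUnmarshal {
       public static void main(String[] args) throws Exception {
           DefaultCamelContext ctx = new DefaultCamelContext();
           ctx.addRoutes(new RouteBuilder() {
               @Override
               public void configure() {
                   // marshal(): Java object (e.g. a Map) -> JSON text
                   from("direct:toJson").marshal().json(JsonLibrary.Jackson).to("log:marshalled");
                   // unmarshal(): JSON text/bytes -> Java object (a Map)
                   from("direct:fromJson").unmarshal().json(JsonLibrary.Jackson).to("log:unmarshalled");
               }
           });
           ctx.start();
   
           ctx.createProducerTemplate().sendBody("direct:toJson", Map.of("order_id", "abc"));
           ctx.createProducerTemplate().sendBody("direct:fromJson", "{\"order_id\": \"abc\"}");
   
           ctx.stop();
       }
   }
   ```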




[GitHub] [camel-kafka-connector] jaehyeon-kim closed issue #1532: [ERROR] DynamoDB sink connector - No type converter available

Posted by "jaehyeon-kim (via GitHub)" <gi...@apache.org>.
jaehyeon-kim closed issue #1532: [ERROR] DynamoDB sink connector - No type converter available
URL: https://github.com/apache/camel-kafka-connector/issues/1532




[GitHub] [camel-kafka-connector] jaehyeon-kim commented on issue #1532: [ERROR] DynamoDB sink connector - No type converter available

Posted by "jaehyeon-kim (via GitHub)" <gi...@apache.org>.
jaehyeon-kim commented on issue #1532:
URL: https://github.com/apache/camel-kafka-connector/issues/1532#issuecomment-1564822876

   It works with the `camel.sink.unmarshal=jackson` option, as indicated in https://github.com/apache/camel-kafka-connector/blob/main/connectors/camel-aws-ddb-sink-kafka-connector/src/main/resources/kamelets/aws-ddb-sink.kamelet.yaml.
   
   Here is the final configuration.
   
   ```
   {
     "name": "order-sink",
     "config": {
       "connector.class": "org.apache.camel.kafkaconnector.awsddbsink.CamelAwsddbsinkSinkConnector",
       "tasks.max": "1",
       "key.converter": "org.apache.kafka.connect.storage.StringConverter",
       "key.converter.schemas.enable": false,
       "value.converter": "org.apache.kafka.connect.json.JsonConverter",
       "value.converter.schemas.enable": false,
       "topics": "order",
       "camel.kamelet.aws-ddb-sink.table": "orders",
       "camel.kamelet.aws-ddb-sink.region": "ap-southeast-2",
       "camel.kamelet.aws-ddb-sink.operation": "PutItem",
       "camel.kamelet.aws-ddb-sink.writeCapacity": 1,
       "camel.kamelet.aws-ddb-sink.useDefaultCredentialsProvider": true,
       "camel.sink.unmarshal": "jackson"
     }
   }
   ```
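   
   For reference, a producer sketch that matches this converter setup (illustrative only; the class name and the `localhost:9092` broker address are assumptions): the key is sent as a plain string for `StringConverter`, and the value as schemaless JSON text for `JsonConverter`.
   
   ```
   import org.apache.kafka.clients.producer.KafkaProducer;
   import org.apache.kafka.clients.producer.ProducerRecord;
   import org.apache.kafka.common.serialization.StringSerializer;
   
   import java.util.Properties;
   
   // Illustrative producer for the "order" topic consumed by the sink connector above.
   public class OrderProducerSketch {
       public static void main(String[] args) {
           Properties props = new Properties();
           props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
           props.put("key.serializer", StringSerializer.class.getName());
           props.put("value.serializer", StringSerializer.class.getName());
   
           try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
               String key = "{\"order_id\": \"2eace6b6-898d-4ea9-88ac-fecebec30bed\"}";
               String value = "{\"quantity\": 2, \"product_id\": \"0894570382\", "
                       + "\"order_id\": \"2eace6b6-898d-4ea9-88ac-fecebec30bed\", "
                       + "\"customer_id\": 183, \"customer_name\": \"Milda Nitzsche\"}";
               producer.send(new ProducerRecord<>("order", key, value));
               producer.flush();
           }
       }
   }
   ```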

