Posted to users@kafka.apache.org by "omri.alon@myheritage.com" <om...@myheritage.com> on 2017/08/03 14:01:26 UTC

Kafka-connect-jdbc: problem connecting to Redshift.

Hey all,
I'm trying to run the JDBC sink connector,
and I keep getting the following error:

ERROR Task test-redshift-sink-0 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerSinkTask:449)
org.apache.kafka.connect.errors.ConnectException: No fields found using key and value schemas for table: kafka_test
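
From what I understand, this error means that neither the record key nor the
record value contributed any fields the sink can map to table columns, e.g.
because the value doesn't decode to a record/struct with at least one field.
As a sanity check, the value schema the converter resolves can be fetched from
the Schema Registry (assuming the default <topic>-value subject naming):

curl http://kafka10test:8081/subjects/cdc_system-value/versions/latest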


Do you guys have any idea what could be the issue?
Thanks a lot!


Connect worker (standalone) properties file:
==================
bootstrap.servers=kafka10test:9092

# The converters specify the format of data in Kafka and how to translate it into Connect data.
# Every Connect user will need to configure these based on the format they want their data in
# when loaded from or stored into Kafka
#key.converter=io.confluent.connect.avro.AvroConverter
#key.converter.schema.registry.url=http://localhost:8081


key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://kafka10test:8081

value.converter.enhanced.avro.schema.support=true

rest.port=8090
# The internal converter used for offsets and config data is configurable and must be specified,
# but most users will always want to use the built-in default. Offset and config data is never
# visible outside of Connect in this format.
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false

# Local storage file for offset data
offset.storage.file.filename=/tmp/connect.offsets
==================
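
For completeness, Connect is launched in standalone mode with both files,
something like this (script name and file paths depend on the distribution;
these are just examples):

bin/connect-standalone worker.properties redshift-sink.properties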


sink connector properties file:
==================
name=test-redshift-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1

# The topics to consume from - required for sink connectors like this one
topics=cdc_system


# Configuration specific to the JDBC sink connector.
# We connect to Redshift via the PostgreSQL JDBC driver.
connection.url=jdbc:postgresql://host
connection.user=USER
connection.password=PASS

key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://kafka10test:8081

value.converter.enhanced.avro.schema.support=true

# Auto-create the table and auto-evolve its schema (both disabled here).
auto.create=false
auto.evolve=false
# Write mode: plain inserts (upsert/update are only available where the dialect supports them).
insert.mode=insert

# Table name to insert data into
table.name.format=kafka_test
==================
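
In case it helps, a record from the topic can be inspected with the Avro
console consumer, to confirm the value really decodes to a record with fields
(hostnames as in the configs above; script name varies by distribution):

bin/kafka-avro-console-consumer --bootstrap-server kafka10test:9092 \
  --topic cdc_system --property schema.registry.url=http://kafka10test:8081 \
  --from-beginning --max-messages 1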