Posted to users@kafka.apache.org by Vladoiu Catalin <vl...@gmail.com> on 2017/09/07 09:50:22 UTC
StackOverflowError for Connect WorkerTask
Hi guys,
I am trying to use Confluent Platform 3.3.0 with the S3 connector, and I am
getting a StackOverflowError:
java.lang.StackOverflowError
at java.util.HashMap.hash(HashMap.java:338)
at java.util.LinkedHashMap.get(LinkedHashMap.java:440)
at org.apache.avro.JsonProperties.getJsonProp(JsonProperties.java:141)
at org.apache.avro.JsonProperties.getProp(JsonProperties.java:130)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1258)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1239)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1348)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1381)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1359)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1239)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1348)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1381)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1359)
with the following schema structure:
@namespace("com.test.avro")
protocol TestError {
  record TestError {
    union { null, string } type = null;
    union { null, array<TestError> } errors = null;
  }
}
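Note that the schema is self-referential: the errors field of TestError
contains an array of TestError. Any converter that descends into a schema
without tracking the named types already on the current path will recurse
forever on such a schema. For illustration only (the class and method names
below are mine, not from Avro or Confluent), here is a minimal sketch of the
kind of visited-set guard that makes such a traversal terminate:

import java.util.Set;
import org.apache.avro.Schema;

public class SchemaWalker {
  // Walks an Avro schema. The set holds the full names of the records on
  // the current path, so a self-reference is skipped instead of recursed
  // into, and the walk terminates even for recursive schemas.
  static void walk(Schema schema, Set<String> visiting) {
    switch (schema.getType()) {
      case RECORD:
        if (!visiting.add(schema.getFullName())) {
          return; // cycle: this record is already being processed
        }
        for (Schema.Field field : schema.getFields()) {
          walk(field.schema(), visiting);
        }
        visiting.remove(schema.getFullName());
        break;
      case UNION:
        for (Schema branch : schema.getTypes()) {
          walk(branch, visiting);
        }
        break;
      case ARRAY:
        walk(schema.getElementType(), visiting);
        break;
      case MAP:
        walk(schema.getValueType(), visiting);
        break;
      default:
        break; // primitives, enums, fixed: nothing to descend into
    }
  }
}

Calling walk(schema, new java.util.HashSet<>()) on the parsed schema from
the test below returns normally, whereas the stack trace above suggests the
recursion in toConnectSchema never hits such a stopping condition.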
I wrote a basic unit test for AvroData.toConnectSchema(...), and it fails
with the same error:
@Test
public void testToConnectDateAvroCustomEanError() {
  org.apache.avro.Schema avroSchema = new org.apache.avro.Schema.Parser()
      .parse("{\"type\":\"record\",\"name\":\"TestError\",\"namespace\":\"com.test.avro\",\"fields\":[{\"name\":\"type\",\"type\":[\"null\",{\"type\":\"string\",\"avro.java.string\":\"String\"}],\"default\":null},{\"name\":\"errors\",\"type\":[\"null\",{\"type\":\"array\",\"items\":\"TestError\"}],\"default\":null}]}");
  avroData.toConnectData(avroSchema, 10000);
}
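For readability, the escaped schema string in that test is the following
JSON (same content, just pretty-printed). The errors array's items refer
back to TestError by name, which is the self-reference that triggers the
recursion:

{
  "type": "record",
  "name": "TestError",
  "namespace": "com.test.avro",
  "fields": [
    {
      "name": "type",
      "type": ["null", {"type": "string", "avro.java.string": "String"}],
      "default": null
    },
    {
      "name": "errors",
      "type": ["null", {"type": "array", "items": "TestError"}],
      "default": null
    }
  ]
}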
Is this a known issue, or am I doing something wrong?
Thanks,
Catalin