Posted to dev@flink.apache.org by Dylan Forciea <dy...@oseberg.io> on 2020/10/22 13:58:38 UTC

Re: NullPointerException when trying to read null array in Postgres using JDBC Connector

Danny,

Thanks! I have created a new JIRA issue [1]. I’ll look into how hard it is to put together a patch and unit test myself, although I may need a hand with the process of making a change to both the master branch and a release branch if a fix is desired for 1.11.

Regards,
Dylan Forciea

[1] https://issues.apache.org/jira/browse/FLINK-19771

From: Danny Chan <da...@apache.org>
Date: Thursday, October 22, 2020 at 4:34 AM
To: Dylan Forciea <dy...@oseberg.io>
Cc: Flink ML <us...@flink.apache.org>
Subject: Re: NullPointerException when trying to read null array in Postgres using JDBC Connector

Yes, the current code throws directly for NULLs. Can you log an issue for it?
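
The missing piece is presumably just a null check before the array value is dereferenced. A rough sketch of the guard pattern only, assuming nothing about the actual converter internals (the class and method below are made up for illustration, and the real converter also has to wrap the elements in Flink’s internal array representation):

import java.sql.Array;
import java.sql.SQLException;

// Illustrative sketch, not the Flink source: the per-field conversion for an
// array column should map SQL NULL to null instead of calling getArray() on it.
public final class NullSafeArrayConversion {

    // Hypothetical helper; in Flink this logic lives in the converter lambda
    // that PostgresRowConverter creates for ARRAY fields.
    static Object convertArrayField(Object jdbcField) throws SQLException {
        if (jdbcField == null) {
            return null; // a NULL array column becomes a null field value
        }
        Array sqlArray = (Array) jdbcField;
        return sqlArray.getArray(); // element-level conversion omitted for brevity
    }
}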

Dylan Forciea <dy...@oseberg.io> wrote on Wednesday, October 21, 2020 at 4:30 AM:
I believe I am getting an error because I have a nullable Postgres text array that is set to NULL, which I’m reading using the JDBC SQL Connector. Is this something that should be supported? Looking at the source code line referenced in the stack trace below, it doesn’t look like the case of a null array is handled.

[error] Caused by: java.io.IOException: Couldn't access resultSet
[error]   at org.apache.flink.connector.jdbc.table.JdbcRowDataInputFormat.nextRecord(JdbcRowDataInputFormat.java:266)
[error]   at org.apache.flink.connector.jdbc.table.JdbcRowDataInputFormat.nextRecord(JdbcRowDataInputFormat.java:57)
[error]   at org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:91)
[error]   at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
[error]   at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
[error]   at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:213)
[error] Caused by: java.lang.NullPointerException
[error]   at org.apache.flink.connector.jdbc.internal.converter.PostgresRowConverter.lambda$createPostgresArrayConverter$c06ce9f4$2(PostgresRowConverter.java:97)
[error]   at org.apache.flink.connector.jdbc.internal.converter.AbstractJdbcRowConverter.toInternal(AbstractJdbcRowConverter.java:79)
[error]   at org.apache.flink.connector.jdbc.table.JdbcRowDataInputFormat.nextRecord(JdbcRowDataInputFormat.java:259)
[error]   ... 5 more
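
For anyone trying to reproduce this, the setup is roughly the following sketch (all table, column, and connection details here are placeholders rather than my actual schema): a JDBC table whose ARRAY<STRING> column maps to a nullable text[] in Postgres, queried so that a row with a NULL array comes back.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

// Placeholder reproduction sketch: a JDBC source table with a nullable array column.
public class NullArrayRepro {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        tEnv.executeSql(
                "CREATE TABLE items (" +
                "  id BIGINT," +
                "  tags ARRAY<STRING>" + // backed by a nullable text[] column in Postgres
                ") WITH (" +
                "  'connector' = 'jdbc'," +
                "  'url' = 'jdbc:postgresql://localhost:5432/mydb'," +
                "  'table-name' = 'items'," +
                "  'username' = 'user'," +
                "  'password' = 'password'" +
                ")");

        // A row where tags IS NULL is what hits the converter and produces the NPE above.
        tEnv.executeSql("SELECT id, tags FROM items").print();
    }
}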

Thanks,
Dylan Forciea

Re: NullPointerException when trying to read null array in Postgres using JDBC Connector

Posted by Dylan Forciea <dy...@oseberg.io>.
Looking at this, it’s a simple enough fix. My question is just around a unit test that covers this particular bug. It doesn’t look like there is anything that directly tests the Postgres row converter; there is a test that uses Derby, but as far as I can tell only Postgres supports arrays.

I could create a unit test for the PostgresRowConverter and cover the array handling there. If that sounds like a good plan, I’d be willing to create a PR to fix this issue.
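
To make that concrete, here is a rough sketch of the kind of test I have in mind. I’m writing the Flink class and constructor names from memory, so treat the exact signatures as assumptions to check against master; the idea is simply to hand the converter a result set whose array column comes back as SQL NULL and assert that the field ends up null instead of triggering the NPE.

import static org.junit.Assert.assertTrue;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.sql.ResultSet;

import org.apache.flink.connector.jdbc.internal.converter.PostgresRowConverter;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.types.logical.ArrayType;
import org.apache.flink.table.types.logical.RowType;
import org.apache.flink.table.types.logical.VarCharType;
import org.junit.Test;

// Sketch only; signatures are from memory and may need adjusting against master.
public class PostgresRowConverterTest {

    @Test
    public void testNullArrayColumn() throws Exception {
        // One-field row whose single column is ARRAY<STRING>.
        RowType rowType = RowType.of(new ArrayType(new VarCharType(VarCharType.MAX_LENGTH)));
        PostgresRowConverter converter = new PostgresRowConverter(rowType);

        // Mock a result set where the array column is SQL NULL.
        ResultSet resultSet = mock(ResultSet.class);
        when(resultSet.getObject(1)).thenReturn(null);

        // Before the fix this throws a NullPointerException; afterwards it
        // should produce a row with a null field.
        RowData row = converter.toInternal(resultSet);
        assertTrue(row.isNullAt(0));
    }
}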

Regards,
Dylan Forciea
