Posted to user@flink.apache.org by Chesnay Schepler <ch...@apache.org> on 2022/01/11 14:14:49 UTC

Re: Could not find any factory for identifier 'jdbc'

How do you ensure that the connector is actually available at runtime? 
Are you bundling it in a jar or putting it into Flink's lib directory?

On 11/01/2022 14:14, Ronak Beejawat (rbeejawa) wrote:
> Correcting subject -> Could not find any factory for identifier 'jdbc'
>
> From: Ronak Beejawat (rbeejawa)
> Sent: Tuesday, January 11, 2022 6:43 PM
> To: 'dev@flink.apache.org' <de...@flink.apache.org>; 'community@flink.apache.org' <co...@flink.apache.org>; 'user@flink.apache.org' <us...@flink.apache.org>
> Cc: 'Hang Ruan' <ru...@gmail.com>; Shrinath Shenoy K (sshenoyk) <ss...@cisco.com>; Karthikeyan Muthusamy (karmuthu) <ka...@cisco.com>; Krishna Singitam (ksingita) <ks...@cisco.com>; Arun Yadav (aruny) <ar...@cisco.com>; Jayaprakash Kuravatti (jkuravat) <jk...@cisco.com>; Avi Sanwal (asanwal) <as...@cisco.com>
> Subject: what is efficient way to write Left join in flink
>
> Hi Team,
>
> Getting the below exception while using the jdbc connector:
>
> Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'jdbc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
>
> Available factory identifiers are:
>
> blackhole
> datagen
> filesystem
> kafka
> print
> upsert-kafka
>
>
> I have already added the dependency for the jdbc connector in pom.xml, as mentioned below:
>
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-jdbc_2.11</artifactId>
>     <version>1.14.2</version>
> </dependency>
> <dependency>
>     <groupId>mysql</groupId>
>     <artifactId>mysql-connector-java</artifactId>
>     <version>5.1.41</version>
> </dependency>
>
> I referred to the release docs for this: https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/jdbc/
>
>
>
> Please help me with this and provide a solution!
>
>
> Thanks
> Ronak Beejawat



Re: Could not find any factory for identifier 'jdbc'

Posted by Chesnay Schepler <ch...@apache.org>.
Are you using the maven-jar-plugin to create the jar?

My suspicion is that the META-INF/services files are not being properly 
merged. Each connector jar ships a 
META-INF/services/org.apache.flink.table.factories.Factory file, and if 
those files are not concatenated when the bundle is built, only one 
connector's factories remain discoverable. I'd suggest using the 
maven-shade-plugin as shown in the quickstarts.
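
As a minimal sketch, assuming a quickstart-style pom.xml (the plugin 
version here is an assumption, not taken from this thread): the 
ServicesResourceTransformer is the piece that concatenates the 
META-INF/services files from all bundled connector jars.

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <!-- version is an assumption; use whatever the quickstart pom pins -->
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- merge META-INF/services files instead of letting one
                         connector's file overwrite the others -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>

With this in place, the shaded jar's 
META-INF/services/org.apache.flink.table.factories.Factory should list 
the jdbc factory alongside the kafka one.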

On 13/01/2022 05:34, Ronak Beejawat (rbeejawa) wrote:
>
> Hi Roman, Chesnay
>
> PFB a screenshot of the jdbc connector's availability in the bundled
> jar. As I mentioned earlier, it still didn't work, so I tried putting
> the connector inside the Flink lib directory, as mentioned in the
> article linked below, and that resolved the issue.
>
>
> @Roman – I also tried flink-connector-jdbc_2.12; it didn't work either.
>
> Thanks
>
> Ronak Beejawat
>
> From: Roman Khachatryan <ro...@apache.org>
> Date: Wednesday, 12 January 2022 at 6:57 PM
> To: community@flink.apache.org <co...@flink.apache.org>
> Cc: dev <de...@flink.apache.org>, Ronak Beejawat (rbeejawa)
> <rb...@cisco.com.invalid>, user@flink.apache.org
> <us...@flink.apache.org>, Hang Ruan <ru...@gmail.com>, Shrinath
> Shenoy K (sshenoyk) <ss...@cisco.com>, Karthikeyan Muthusamy
> (karmuthu) <ka...@cisco.com>, Krishna Singitam (ksingita)
> <ks...@cisco.com>, Arun Yadav (aruny) <ar...@cisco.com>,
> Jayaprakash Kuravatti (jkuravat) <jk...@cisco.com>, Avi Sanwal
> (asanwal) <as...@cisco.com>
> Subject: Re: Could not find any factory for identifier 'jdbc'
>
> Hi,
>
> I think Chesnay's suggestion to double-check the bundle makes sense.
> Additionally, I'd try flink-connector-jdbc_2.12 instead of
> flink-connector-jdbc_2.11.
>
> Regards,
> Roman
>
> On Wed, Jan 12, 2022 at 12:23 PM Chesnay Schepler <ch...@apache.org> 
> wrote:
> >
> > I would try double-checking whether the jdbc connector was truly bundled
> > in your jar, specifically whether
> > org.apache.flink.connector.jdbc.table.JdbcDynamicTableFactory is.
> >
> > I can't think of a reason why this shouldn't work for the JDBC 
> connector.
> >
> > On 12/01/2022 06:34, Ronak Beejawat (rbeejawa) wrote:
> > > Hi Chesnay,
> > >
> > > How do you ensure that the connector is actually available at runtime?
> > >
> > > We are providing the below-mentioned dependency in pom.xml with
> > > compile scope, so it is available on the classpath, and it was
> > > present in my Flink job's bundled jar. We do the same for other
> > > connectors; for kafka, say, it worked.
> > >
> > > <dependency>
> > >     <groupId>org.apache.flink</groupId>
> > >     <artifactId>flink-connector-jdbc_2.11</artifactId>
> > >     <version>1.14.2</version>
> > > </dependency>
> > > <dependency>
> > >     <groupId>mysql</groupId>
> > >     <artifactId>mysql-connector-java</artifactId>
> > >     <version>5.1.41</version>
> > > </dependency>
> > >
> > > Are you bundling it in a jar or putting it into Flink's lib directory?
> > > Yes, we are building a jar and the connector is bundled in it, but we
> > > still saw this error. So we tried the workaround mentioned in an
> > > article, putting the connector jar inside the Flink lib directory, and
> > > then it worked:
> > > https://blog.csdn.net/weixin_44056920/article/details/118110949 . So
> > > this is extra work we have to do, and it needs a restart of the cluster.
> > >
> > > But the question is: why did it work for kafka and not for jdbc? I
> > > didn't put the kafka jar explicitly in the Flink lib folder.
> > >
> > > Note: I am using the Flink 1.14 release for all my job execution /
> > > implementation, which is a stable version, I guess.
> > >
> > > Thanks
> > > Ronak Beejawat
> > > From: Chesnay Schepler <chesnay@apache.org>
> > > Date: Tuesday, 11 January 2022 at 7:45 PM
> > > To: Ronak Beejawat (rbeejawa) <rbeejawa@cisco.com.INVALID>,
> > > user@flink.apache.org <user@flink.apache.org>
> > > Cc: Hang Ruan <ruanhang1993@gmail.com>, Shrinath Shenoy K (sshenoyk)
> > > <sshenoyk@cisco.com>, Karthikeyan Muthusamy (karmuthu) <karmuthu@cisco.com>,
> > > Krishna Singitam (ksingita) <ksingita@cisco.com>, Arun Yadav (aruny)
> > > <aruny@cisco.com>, Jayaprakash Kuravatti (jkuravat) <jkuravat@cisco.com>,
> > > Avi Sanwal (asanwal) <asanwal@cisco.com>
> > > Subject: Re: Could not find any factory for identifier 'jdbc'
> > > How do you ensure that the connector is actually available at runtime?
> > > Are you bundling it in a jar or putting it into Flink's lib directory?
> > >
> > > On 11/01/2022 14:14, Ronak Beejawat (rbeejawa) wrote:
> > >> Correcting subject -> Could not find any factory for identifier 'jdbc'
> > >>
> > >> From: Ronak Beejawat (rbeejawa)
> > >> Sent: Tuesday, January 11, 2022 6:43 PM
> > >> To: 'dev@flink.apache.org' <dev@flink.apache.org>;
> > >> 'community@flink.apache.org' <community@flink.apache.org>;
> > >> 'user@flink.apache.org' <user@flink.apache.org>
> > >> Cc: 'Hang Ruan' <ruanhang1993@gmail.com>; Shrinath Shenoy K (sshenoyk)
> > >> <sshenoyk@cisco.com>; Karthikeyan Muthusamy (karmuthu) <karmuthu@cisco.com>;
> > >> Krishna Singitam (ksingita) <ksingita@cisco.com>; Arun Yadav (aruny)
> > >> <aruny@cisco.com>; Jayaprakash Kuravatti (jkuravat) <jkuravat@cisco.com>;
> > >> Avi Sanwal (asanwal) <asanwal@cisco.com>
> > >> Subject: what is efficient way to write Left join in flink
> > >>
> > >> Hi Team,
> > >>
> > >> Getting the below exception while using the jdbc connector:
> > >>
> > >> Caused by: org.apache.flink.table.api.ValidationException: Could 
> not find any factory for identifier 'jdbc' that implements 
> 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
> > >>
> > >> Available factory identifiers are:
> > >>
> > >> blackhole
> > >> datagen
> > >> filesystem
> > >> kafka
> > >> print
> > >> upsert-kafka
> > >>
> > >>
> > >> I have already added the dependency for the jdbc connector in
> > >> pom.xml, as mentioned below:
> > >>
> > >> <dependency>
> > >>     <groupId>org.apache.flink</groupId>
> > >>     <artifactId>flink-connector-jdbc_2.11</artifactId>
> > >>     <version>1.14.2</version>
> > >> </dependency>
> > >> <dependency>
> > >>     <groupId>mysql</groupId>
> > >>     <artifactId>mysql-connector-java</artifactId>
> > >>     <version>5.1.41</version>
> > >> </dependency>
> > >>
> > >> I referred to the release docs for this:
> > >> https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/jdbc/
> > >>
> > >>
> > >>
> > >> Please help me with this and provide a solution!
> > >>
> > >>
> > >> Thanks
> > >> Ronak Beejawat
> >
> >
>


RE: Could not find any factory for identifier 'jdbc'

Posted by "Ronak Beejawat (rbeejawa)" <rb...@cisco.com.INVALID>.
Hi Roman, Chesnay

PFB a screenshot of the jdbc connector's availability in the bundled jar. As I mentioned earlier, it still didn't work, so I tried putting the connector inside the Flink lib directory, as mentioned in the article linked below, and that resolved the issue.



[screenshots: the jdbc connector shown present inside the bundled job jar]

@Roman - I also tried flink-connector-jdbc_2.12; it didn't work either.

Thanks
Ronak Beejawat
From: Roman Khachatryan <ro...@apache.org>
Date: Wednesday, 12 January 2022 at 6:57 PM
To: community@flink.apache.org <co...@flink.apache.org>
Cc: dev <de...@flink.apache.org>, Ronak Beejawat (rbeejawa) <rb...@cisco.com.invalid>, user@flink.apache.org <us...@flink.apache.org>, Hang Ruan <ru...@gmail.com>, Shrinath Shenoy K (sshenoyk) <ss...@cisco.com>, Karthikeyan Muthusamy (karmuthu) <ka...@cisco.com>, Krishna Singitam (ksingita) <ks...@cisco.com>, Arun Yadav (aruny) <ar...@cisco.com>, Jayaprakash Kuravatti (jkuravat) <jk...@cisco.com>, Avi Sanwal (asanwal) <as...@cisco.com>
Subject: Re: Could not find any factory for identifier 'jdbc'
Hi,

I think Chesnay's suggestion to double-check the bundle makes sense.
Additionally, I'd try flink-connector-jdbc_2.12 instead of
flink-connector-jdbc_2.11.

Regards,
Roman

On Wed, Jan 12, 2022 at 12:23 PM Chesnay Schepler <ch...@apache.org>> wrote:
>
> I would try double-checking whether the jdbc connector was truly bundled
> in your jar, specifically whether
> org.apache.flink.connector.jdbc.table.JdbcDynamicTableFactory is.
>
> I can't think of a reason why this shouldn't work for the JDBC connector.
>
> On 12/01/2022 06:34, Ronak Beejawat (rbeejawa) wrote:
> > Hi Chesnay,
> >
> > How do you ensure that the connector is actually available at runtime?
> >
> > We are providing the below-mentioned dependency in pom.xml with compile scope, so it is available on the classpath, and it was present in my Flink job's bundled jar. We do the same for other connectors; for kafka, say, it worked.
> >
> > <dependency>
> >     <groupId>org.apache.flink</groupId>
> >     <artifactId>flink-connector-jdbc_2.11</artifactId>
> >     <version>1.14.2</version>
> > </dependency>
> > <dependency>
> >     <groupId>mysql</groupId>
> >     <artifactId>mysql-connector-java</artifactId>
> >     <version>5.1.41</version>
> > </dependency>
> >
> > Are you bundling it in a jar or putting it into Flink's lib directory?
> > Yes, we are building a jar and the connector is bundled in it, but we still saw this error. So we tried the workaround mentioned in an article, putting the connector jar inside the Flink lib directory, and then it worked: https://blog.csdn.net/weixin_44056920/article/details/118110949 . So this is extra work we have to do, and it needs a restart of the cluster.
> >
> > But the question is: why did it work for kafka and not for jdbc? I didn't put the kafka jar explicitly in the Flink lib folder.
> >
> > Note: I am using the Flink 1.14 release for all my job execution / implementation, which is a stable version, I guess.
> >
> > Thanks
> > Ronak Beejawat
> > From: Chesnay Schepler <ch...@apache.org>
> > Date: Tuesday, 11 January 2022 at 7:45 PM
> > To: Ronak Beejawat (rbeejawa) <rb...@cisco.com.INVALID>, user@flink.apache.org <us...@flink.apache.org>
> > Cc: Hang Ruan <ru...@gmail.com>, Shrinath Shenoy K (sshenoyk) <ss...@cisco.com>, Karthikeyan Muthusamy (karmuthu) <ka...@cisco.com>, Krishna Singitam (ksingita) <ks...@cisco.com>, Arun Yadav (aruny) <ar...@cisco.com>, Jayaprakash Kuravatti (jkuravat) <jk...@cisco.com>, Avi Sanwal (asanwal) <as...@cisco.com>
> > Subject: Re: Could not find any factory for identifier 'jdbc'
> > How do you ensure that the connector is actually available at runtime?
> > Are you bundling it in a jar or putting it into Flink's lib directory?
> >
> > On 11/01/2022 14:14, Ronak Beejawat (rbeejawa) wrote:
> >> Correcting subject -> Could not find any factory for identifier 'jdbc'
> >>
> >> From: Ronak Beejawat (rbeejawa)
> >> Sent: Tuesday, January 11, 2022 6:43 PM
> >> To: 'dev@flink.apache.org' <de...@flink.apache.org>; 'community@flink.apache.org' <co...@flink.apache.org>; 'user@flink.apache.org' <us...@flink.apache.org>
> >> Cc: 'Hang Ruan' <ru...@gmail.com>; Shrinath Shenoy K (sshenoyk) <ss...@cisco.com>; Karthikeyan Muthusamy (karmuthu) <ka...@cisco.com>; Krishna Singitam (ksingita) <ks...@cisco.com>; Arun Yadav (aruny) <ar...@cisco.com>; Jayaprakash Kuravatti (jkuravat) <jk...@cisco.com>; Avi Sanwal (asanwal) <as...@cisco.com>
> >> Subject: what is efficient way to write Left join in flink
> >>
> >> Hi Team,
> >>
> >> Getting the below exception while using the jdbc connector:
> >>
> >> Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'jdbc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
> >>
> >> Available factory identifiers are:
> >>
> >> blackhole
> >> datagen
> >> filesystem
> >> kafka
> >> print
> >> upsert-kafka
> >>
> >>
> >> I have already added the dependency for the jdbc connector in pom.xml, as mentioned below:
> >>
> >> <dependency>
> >>     <groupId>org.apache.flink</groupId>
> >>     <artifactId>flink-connector-jdbc_2.11</artifactId>
> >>     <version>1.14.2</version>
> >> </dependency>
> >> <dependency>
> >>     <groupId>mysql</groupId>
> >>     <artifactId>mysql-connector-java</artifactId>
> >>     <version>5.1.41</version>
> >> </dependency>
> >>
> >> I referred to the release docs for this: https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/jdbc/
> >>
> >>
> >>
> >> Please help me with this and provide a solution!
> >>
> >>
> >> Thanks
> >> Ronak Beejawat
>
>

Re: Could not find any factory for identifier 'jdbc'

Posted by Roman Khachatryan <ro...@apache.org>.
Hi,

I think Chesnay's suggestion to double-check the bundle makes sense.
Additionally, I'd try flink-connector-jdbc_2.12 instead of
flink-connector-jdbc_2.11.
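
A minimal sketch of that change in the pom (same 1.14.2 version as
already used in the thread; the _2.12 suffix selects the Scala 2.12
build of the connector):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-jdbc_2.12</artifactId>
    <version>1.14.2</version>
</dependency>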

Regards,
Roman

On Wed, Jan 12, 2022 at 12:23 PM Chesnay Schepler <ch...@apache.org> wrote:
>
> I would try double-checking whether the jdbc connector was truly bundled
> in your jar, specifically whether
> org.apache.flink.connector.jdbc.table.JdbcDynamicTableFactory is.
>
> I can't think of a reason why this shouldn't work for the JDBC connector.
>
> On 12/01/2022 06:34, Ronak Beejawat (rbeejawa) wrote:
> > Hi Chesnay,
> >
> > How do you ensure that the connector is actually available at runtime?
> >
> > We are providing the below-mentioned dependency in pom.xml with compile scope, so it is available on the classpath, and it was present in my Flink job's bundled jar. We do the same for other connectors; for kafka, say, it worked.
> >
> > <dependency>
> >     <groupId>org.apache.flink</groupId>
> >     <artifactId>flink-connector-jdbc_2.11</artifactId>
> >     <version>1.14.2</version>
> > </dependency>
> > <dependency>
> >     <groupId>mysql</groupId>
> >     <artifactId>mysql-connector-java</artifactId>
> >     <version>5.1.41</version>
> > </dependency>
> >
> > Are you bundling it in a jar or putting it into Flink's lib directory?
> > Yes, we are building a jar and the connector is bundled in it, but we still saw this error. So we tried the workaround mentioned in an article, putting the connector jar inside the Flink lib directory, and then it worked: https://blog.csdn.net/weixin_44056920/article/details/118110949 . So this is extra work we have to do, and it needs a restart of the cluster.
> >
> > But the question is: why did it work for kafka and not for jdbc? I didn't put the kafka jar explicitly in the Flink lib folder.
> >
> > Note: I am using the Flink 1.14 release for all my job execution / implementation, which is a stable version, I guess.
> >
> > Thanks
> > Ronak Beejawat
> > From: Chesnay Schepler <ch...@apache.org>
> > Date: Tuesday, 11 January 2022 at 7:45 PM
> > To: Ronak Beejawat (rbeejawa) <rb...@cisco.com.INVALID>, user@flink.apache.org <us...@flink.apache.org>
> > Cc: Hang Ruan <ru...@gmail.com>, Shrinath Shenoy K (sshenoyk) <ss...@cisco.com>, Karthikeyan Muthusamy (karmuthu) <ka...@cisco.com>, Krishna Singitam (ksingita) <ks...@cisco.com>, Arun Yadav (aruny) <ar...@cisco.com>, Jayaprakash Kuravatti (jkuravat) <jk...@cisco.com>, Avi Sanwal (asanwal) <as...@cisco.com>
> > Subject: Re: Could not find any factory for identifier 'jdbc'
> > How do you ensure that the connector is actually available at runtime?
> > Are you bundling it in a jar or putting it into Flink's lib directory?
> >
> > On 11/01/2022 14:14, Ronak Beejawat (rbeejawa) wrote:
> >> Correcting subject -> Could not find any factory for identifier 'jdbc'
> >>
> >> From: Ronak Beejawat (rbeejawa)
> >> Sent: Tuesday, January 11, 2022 6:43 PM
> >> To: 'dev@flink.apache.org' <de...@flink.apache.org>; 'community@flink.apache.org' <co...@flink.apache.org>; 'user@flink.apache.org' <us...@flink.apache.org>
> >> Cc: 'Hang Ruan' <ru...@gmail.com>; Shrinath Shenoy K (sshenoyk) <ss...@cisco.com>; Karthikeyan Muthusamy (karmuthu) <ka...@cisco.com>; Krishna Singitam (ksingita) <ks...@cisco.com>; Arun Yadav (aruny) <ar...@cisco.com>; Jayaprakash Kuravatti (jkuravat) <jk...@cisco.com>; Avi Sanwal (asanwal) <as...@cisco.com>
> >> Subject: what is efficient way to write Left join in flink
> >>
> >> Hi Team,
> >>
> >> Getting the below exception while using the jdbc connector:
> >>
> >> Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'jdbc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
> >>
> >> Available factory identifiers are:
> >>
> >> blackhole
> >> datagen
> >> filesystem
> >> kafka
> >> print
> >> upsert-kafka
> >>
> >>
> >> I have already added the dependency for the jdbc connector in pom.xml, as mentioned below:
> >>
> >> <dependency>
> >>     <groupId>org.apache.flink</groupId>
> >>     <artifactId>flink-connector-jdbc_2.11</artifactId>
> >>     <version>1.14.2</version>
> >> </dependency>
> >> <dependency>
> >>     <groupId>mysql</groupId>
> >>     <artifactId>mysql-connector-java</artifactId>
> >>     <version>5.1.41</version>
> >> </dependency>
> >>
> >> I referred to the release docs for this: https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/jdbc/
> >>
> >>
> >>
> >> Please help me with this and provide a solution!
> >>
> >>
> >> Thanks
> >> Ronak Beejawat
>
>


Re: Could not find any factory for identifier 'jdbc'

Posted by Chesnay Schepler <ch...@apache.org>.
I would try double-checking whether the jdbc connector was truly bundled 
in your jar, specifically whether 
org.apache.flink.connector.jdbc.table.JdbcDynamicTableFactory is.

I can't think of a reason why this shouldn't work for the JDBC connector.

On 12/01/2022 06:34, Ronak Beejawat (rbeejawa) wrote:
> Hi Chesnay,
>
> How do you ensure that the connector is actually available at runtime?
>
> We are providing the below-mentioned dependency in pom.xml with compile scope, so it is available on the classpath, and it was present in my Flink job's bundled jar. We do the same for other connectors; for kafka, say, it worked.
>
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-jdbc_2.11</artifactId>
>     <version>1.14.2</version>
> </dependency>
> <dependency>
>     <groupId>mysql</groupId>
>     <artifactId>mysql-connector-java</artifactId>
>     <version>5.1.41</version>
> </dependency>
>
> Are you bundling it in a jar or putting it into Flink's lib directory?
> Yes, we are building a jar and the connector is bundled in it, but we still saw this error. So we tried the workaround mentioned in an article, putting the connector jar inside the Flink lib directory, and then it worked: https://blog.csdn.net/weixin_44056920/article/details/118110949 . So this is extra work we have to do, and it needs a restart of the cluster.
>
> But the question is: why did it work for kafka and not for jdbc? I didn't put the kafka jar explicitly in the Flink lib folder.
>
> Note: I am using the Flink 1.14 release for all my job execution / implementation, which is a stable version, I guess.
>
> Thanks
> Ronak Beejawat
> From: Chesnay Schepler <ch...@apache.org>
> Date: Tuesday, 11 January 2022 at 7:45 PM
> To: Ronak Beejawat (rbeejawa) <rb...@cisco.com.INVALID>, user@flink.apache.org <us...@flink.apache.org>
> Cc: Hang Ruan <ru...@gmail.com>, Shrinath Shenoy K (sshenoyk) <ss...@cisco.com>, Karthikeyan Muthusamy (karmuthu) <ka...@cisco.com>, Krishna Singitam (ksingita) <ks...@cisco.com>, Arun Yadav (aruny) <ar...@cisco.com>, Jayaprakash Kuravatti (jkuravat) <jk...@cisco.com>, Avi Sanwal (asanwal) <as...@cisco.com>
> Subject: Re: Could not find any factory for identifier 'jdbc'
> How do you ensure that the connector is actually available at runtime?
> Are you bundling it in a jar or putting it into Flink's lib directory?
>
> On 11/01/2022 14:14, Ronak Beejawat (rbeejawa) wrote:
>> Correcting subject -> Could not find any factory for identifier 'jdbc'
>>
>> From: Ronak Beejawat (rbeejawa)
>> Sent: Tuesday, January 11, 2022 6:43 PM
>> To: 'dev@flink.apache.org' <de...@flink.apache.org>; 'community@flink.apache.org' <co...@flink.apache.org>; 'user@flink.apache.org' <us...@flink.apache.org>
>> Cc: 'Hang Ruan' <ru...@gmail.com>; Shrinath Shenoy K (sshenoyk) <ss...@cisco.com>; Karthikeyan Muthusamy (karmuthu) <ka...@cisco.com>; Krishna Singitam (ksingita) <ks...@cisco.com>; Arun Yadav (aruny) <ar...@cisco.com>; Jayaprakash Kuravatti (jkuravat) <jk...@cisco.com>; Avi Sanwal (asanwal) <as...@cisco.com>
>> Subject: what is efficient way to write Left join in flink
>>
>> Hi Team,
>>
>> Getting the below exception while using the jdbc connector:
>>
>> Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'jdbc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
>>
>> Available factory identifiers are:
>>
>> blackhole
>> datagen
>> filesystem
>> kafka
>> print
>> upsert-kafka
>>
>>
>> I have already added the dependency for the jdbc connector in pom.xml, as mentioned below:
>>
>> <dependency>
>>     <groupId>org.apache.flink</groupId>
>>     <artifactId>flink-connector-jdbc_2.11</artifactId>
>>     <version>1.14.2</version>
>> </dependency>
>> <dependency>
>>     <groupId>mysql</groupId>
>>     <artifactId>mysql-connector-java</artifactId>
>>     <version>5.1.41</version>
>> </dependency>
>>
>> I referred to the release docs for this: https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/jdbc/
>>
>>
>>
>> Please help me with this and provide a solution!
>>
>>
>> Thanks
>> Ronak Beejawat



RE: Could not find any factory for identifier 'jdbc'

Posted by "Ronak Beejawat (rbeejawa)" <rb...@cisco.com.INVALID>.
Hi Chesnay,

How do you ensure that the connector is actually available at runtime?

We are providing the below-mentioned dependency in pom.xml with compile scope, so it is available on the classpath, and it was present in my Flink job's bundled jar. We do the same for other connectors; for kafka, say, it worked.

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-jdbc_2.11</artifactId>
    <version>1.14.2</version>
</dependency>
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.41</version>
</dependency>

Are you bundling it in a jar or putting it into Flink's lib directory?
Yes, we are building a jar and the connector is bundled in it, but we still saw this error. So we tried the workaround mentioned in an article, putting the connector jar inside the Flink lib directory, and then it worked: https://blog.csdn.net/weixin_44056920/article/details/118110949 . So this is extra work we have to do, and it needs a restart of the cluster.
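
If the connector jar lives in the Flink lib directory like this, a
common pattern (a sketch only; the thread does not confirm it was
used here) is to mark the dependency as provided so it is not bundled
a second time:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-jdbc_2.11</artifactId>
    <version>1.14.2</version>
    <!-- assumed to be supplied by the cluster's lib/ directory at runtime -->
    <scope>provided</scope>
</dependency>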

But the question is: why did it work for kafka and not for jdbc? I didn't put the kafka jar explicitly in the Flink lib folder.

Note: I am using the Flink 1.14 release for all my job execution / implementation, which is a stable version, I guess.

Thanks
Ronak Beejawat
From: Chesnay Schepler <ch...@apache.org>
Date: Tuesday, 11 January 2022 at 7:45 PM
To: Ronak Beejawat (rbeejawa) <rb...@cisco.com.INVALID>, user@flink.apache.org <us...@flink.apache.org>
Cc: Hang Ruan <ru...@gmail.com>, Shrinath Shenoy K (sshenoyk) <ss...@cisco.com>, Karthikeyan Muthusamy (karmuthu) <ka...@cisco.com>, Krishna Singitam (ksingita) <ks...@cisco.com>, Arun Yadav (aruny) <ar...@cisco.com>, Jayaprakash Kuravatti (jkuravat) <jk...@cisco.com>, Avi Sanwal (asanwal) <as...@cisco.com>
Subject: Re: Could not find any factory for identifier 'jdbc'
How do you ensure that the connector is actually available at runtime?
Are you bundling it in a jar or putting it into Flink's lib directory?

On 11/01/2022 14:14, Ronak Beejawat (rbeejawa) wrote:
> Correcting subject -> Could not find any factory for identifier 'jdbc'
>
> From: Ronak Beejawat (rbeejawa)
> Sent: Tuesday, January 11, 2022 6:43 PM
> To: 'dev@flink.apache.org' <de...@flink.apache.org>; 'community@flink.apache.org' <co...@flink.apache.org>; 'user@flink.apache.org' <us...@flink.apache.org>
> Cc: 'Hang Ruan' <ru...@gmail.com>; Shrinath Shenoy K (sshenoyk) <ss...@cisco.com>; Karthikeyan Muthusamy (karmuthu) <ka...@cisco.com>; Krishna Singitam (ksingita) <ks...@cisco.com>; Arun Yadav (aruny) <ar...@cisco.com>; Jayaprakash Kuravatti (jkuravat) <jk...@cisco.com>; Avi Sanwal (asanwal) <as...@cisco.com>
> Subject: what is efficient way to write Left join in flink
>
> Hi Team,
>
> Getting the below exception while using the jdbc connector:
>
> Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'jdbc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
>
> Available factory identifiers are:
>
> blackhole
> datagen
> filesystem
> kafka
> print
> upsert-kafka
>
>
> I have already added the dependency for the jdbc connector in pom.xml, as mentioned below:
>
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-jdbc_2.11</artifactId>
>     <version>1.14.2</version>
> </dependency>
> <dependency>
>     <groupId>mysql</groupId>
>     <artifactId>mysql-connector-java</artifactId>
>     <version>5.1.41</version>
> </dependency>
>
> I referred to the release docs for this: https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/jdbc/
>
>
>
> Please help me with this and provide a solution!
>
>
> Thanks
> Ronak Beejawat