Posted to issues@streampark.apache.org by "SbloodyS (via GitHub)" <gi...@apache.org> on 2023/10/24 03:08:17 UTC

[I] [Bug] Could not find any factory for identifier 'mysql-cdc' in custom code mode. [incubator-streampark]

SbloodyS opened a new issue, #3275:
URL: https://github.com/apache/incubator-streampark/issues/3275

   ### Search before asking
   
   - [X] I had searched in the [issues](https://github.com/apache/incubator-streampark/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.
   
   
   ### Java Version
   
   1.8
   
   ### Scala Version
   
   2.11.x
   
   ### StreamPark Version
   
   2.1.1
   
   ### Flink Version
   
   1.16.2
   
   ### deploy mode
   
   None
   
   ### What happened
   
   I am using `flink-sql-connector-mysql-cdc-2.4.1` in custom code mode. I have verified that the same job runs normally when submitted with the `flink run` command on the same server. However, the following error is reported when the job is started from StreamPark.
   
   ```log
   java.util.concurrent.CompletionException: java.lang.reflect.InvocationTargetException
   	at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:273)
   	at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:280)
   	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1592)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
   	at java.lang.Thread.run(Thread.java:745)
   Caused by: java.lang.reflect.InvocationTargetException
   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at org.apache.streampark.flink.client.FlinkClient$.$anonfun$proxy$1(FlinkClient.scala:80)
   	at org.apache.streampark.flink.proxy.FlinkShimsProxy$.$anonfun$proxy$1(FlinkShimsProxy.scala:60)
   	at org.apache.streampark.common.util.ClassLoaderUtils$.runAsClassLoader(ClassLoaderUtils.scala:38)
   	at org.apache.streampark.flink.proxy.FlinkShimsProxy$.proxy(FlinkShimsProxy.scala:60)
   	at org.apache.streampark.flink.client.FlinkClient$.proxy(FlinkClient.scala:75)
   	at org.apache.streampark.flink.client.FlinkClient$.submit(FlinkClient.scala:49)
   	at org.apache.streampark.flink.client.FlinkClient.submit(FlinkClient.scala)
   	at org.apache.streampark.console.core.service.impl.ApplicationServiceImpl.lambda$start$10(ApplicationServiceImpl.java:1544)
   	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
   	... 3 more
   Caused by: org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Unable to create a source for reading table 'default_catalog.default_database.source_mysqlcdc_t_notify_log'.
   
   Table options are:
   
   'connector'='mysql-cdc'
   'database-name'='yyls_chat'
   'hostname'='172.16.204.59'
   'password'='******'
   'port'='3306'
   'scan.incremental.snapshot.chunk.key-column'='external_userid'
   'scan.startup.mode'='initial'
   'table-name'='t_notify_log'
   'username'='sjzx'
   	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:372)
   	at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
   	at org.apache.flink.client.program.PackagedProgramUtils.getPipelineFromProgram(PackagedProgramUtils.java:158)
   	at org.apache.flink.client.program.PackagedProgramUtils.createJobGraph(PackagedProgramUtils.java:82)
   	at org.apache.streampark.flink.client.trait.FlinkClientTrait.getJobGraph(FlinkClientTrait.scala:242)
   	at org.apache.streampark.flink.client.trait.FlinkClientTrait.getJobGraph$(FlinkClientTrait.scala:222)
   	at org.apache.streampark.flink.client.impl.YarnPerJobClient$.doSubmit(YarnPerJobClient.scala:87)
   	at org.apache.streampark.flink.client.trait.FlinkClientTrait.submit(FlinkClientTrait.scala:125)
   	at org.apache.streampark.flink.client.trait.FlinkClientTrait.submit$(FlinkClientTrait.scala:62)
   	at org.apache.streampark.flink.client.impl.YarnPerJobClient$.submit(YarnPerJobClient.scala:40)
   	at org.apache.streampark.flink.client.FlinkClientHandler$.submit(FlinkClientHandler.scala:40)
   	at org.apache.streampark.flink.client.FlinkClientHandler.submit(FlinkClientHandler.scala)
   	... 16 more
   Caused by: org.apache.flink.table.api.ValidationException: Unable to create a source for reading table 'default_catalog.default_database.source_mysqlcdc_t_notify_log'.
   
   Table options are:
   
   'connector'='mysql-cdc'
   'database-name'='yyls_chat'
   'hostname'='172.16.204.59'
   'password'='******'
   'port'='3306'
   'scan.incremental.snapshot.chunk.key-column'='external_userid'
   'scan.startup.mode'='initial'
   'table-name'='t_notify_log'
   'username'='sjzx'
   	at org.apache.flink.table.factories.FactoryUtil.createDynamicTableSource(FactoryUtil.java:167)
   	at org.apache.flink.table.factories.FactoryUtil.createDynamicTableSource(FactoryUtil.java:192)
   	at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.createDynamicTableSource(CatalogSourceTable.java:175)
   	at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.toRel(CatalogSourceTable.java:115)
   	at org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3619)
   	at org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2559)
   	at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2175)
   	at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2095)
   	at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2038)
   	at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:669)
   	at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:657)
   	at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3462)
   	at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:570)
   	at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:215)
   	at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:191)
   	at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:1498)
   	at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:1253)
   	at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertValidatedSqlNode(SqlToOperationConverter.java:374)
   	at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:262)
   	at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:106)
   	at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:703)
   	at com.lvshou.task.CustomerTagTask.main(CustomerTagTask.java:42)
   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
   	... 27 more
   Caused by: org.apache.flink.table.api.ValidationException: Cannot discover a connector using option: 'connector'='mysql-cdc'
   	at org.apache.flink.table.factories.FactoryUtil.enrichNoMatchingConnectorError(FactoryUtil.java:736)
   	at org.apache.flink.table.factories.FactoryUtil.discoverTableFactory(FactoryUtil.java:710)
   	at org.apache.flink.table.factories.FactoryUtil.createDynamicTableSource(FactoryUtil.java:163)
   	... 53 more
   Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'mysql-cdc' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
   
   Available factory identifiers are:
   
   blackhole
   clickhouse-x
   datagen
   filesystem
   jdbc
   kafka
   kafka-x
   print
   sqlserver-cdc
   upsert-kafka
   upsert-kafka-x
   	at org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:546)
   	at org.apache.flink.table.factories.FactoryUtil.enrichNoMatchingConnectorError(FactoryUtil.java:732)
   	... 55 more
   ```
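
   The bottom of the trace is the actual failure: no `DynamicTableFactory` with identifier `mysql-cdc` is visible on the classpath used when StreamPark builds the job graph, even though the same jar works with `flink run`. For reference, the connector named above is normally pulled in with coordinates along these lines (a sketch; `com.ververica` is the usual group id for the 2.4.x Flink CDC releases and is an assumption, not something stated in this report):

   ```xml
   <!-- Assumed Maven coordinates for the connector mentioned above (Flink CDC 2.4.1). -->
   <dependency>
       <groupId>com.ververica</groupId>
       <artifactId>flink-sql-connector-mysql-cdc</artifactId>
       <version>2.4.1</version>
   </dependency>
   ```

   Declaring the dependency alone is not enough; the connector has to end up in the jar (or on the classpath) that StreamPark actually submits, which is what the shade-plugin fix mentioned later in this thread addresses.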
   
   ### Error Exception
   
    ```log
    See the stack trace in the "What happened" section above.
    ```
   
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes, I am willing to submit a PR! (Are you willing to contribute this PR?)
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@streampark.apache.org.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


Re: [I] [Bug] Could not find any factory for identifier 'mysql-cdc' in custom code mode. [incubator-streampark]

Posted by "SbloodyS (via GitHub)" <gi...@apache.org>.
SbloodyS closed issue #3275: [Bug] Could not find any factory for identifier 'mysql-cdc' in custom code mode.
URL: https://github.com/apache/incubator-streampark/issues/3275




Re: [I] [Bug] Could not find any factory for identifier 'mysql-cdc' in custom code mode. [incubator-streampark]

Posted by "SbloodyS (via GitHub)" <gi...@apache.org>.
SbloodyS commented on issue #3275:
URL: https://github.com/apache/incubator-streampark/issues/3275#issuecomment-1777320605

   Fixed by using the shade plugin.
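
   For anyone hitting the same error: below is a minimal sketch of what the shade-plugin fix usually looks like. The plugin version and the `ServicesResourceTransformer` are assumptions based on common practice for packaging Flink connectors, not details taken from this thread; the transformer matters because table connectors are discovered through `META-INF/services` files, which a plain fat-jar merge can silently overwrite.

   ```xml
   <!-- Hypothetical pom.xml excerpt: build a fat jar and merge META-INF/services
        so that Flink's factory discovery can find the mysql-cdc connector. -->
   <build>
     <plugins>
       <plugin>
         <groupId>org.apache.maven.plugins</groupId>
         <artifactId>maven-shade-plugin</artifactId>
         <version>3.4.1</version>
         <executions>
           <execution>
             <phase>package</phase>
             <goals>
               <goal>shade</goal>
             </goals>
             <configuration>
               <transformers>
                 <!-- Merges META-INF/services files instead of letting one overwrite another. -->
                 <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
               </transformers>
             </configuration>
           </execution>
         </executions>
       </plugin>
     </plugins>
   </build>
   ```

   After `mvn package`, the shaded jar uploaded to StreamPark should contain a `META-INF/services/org.apache.flink.table.factories.Factory` entry that lists the mysql-cdc factory.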




Re: [I] [Bug] Could not find any factory for identifier 'mysql-cdc' in custom code mode. [incubator-streampark]

Posted by "Tandoy (via GitHub)" <gi...@apache.org>.
Tandoy commented on issue #3275:
URL: https://github.com/apache/incubator-streampark/issues/3275#issuecomment-1886152332

   > Fixed by using the shade plugin.
   
   Hello @SbloodyS, could you explain in detail how you solved it?

