Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2021/01/24 02:34:39 UTC

[GitHub] [flink] mohitpali opened a new pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

mohitpali opened a new pull request #14737:
URL: https://github.com/apache/flink/pull/14737


   
   ## What is the purpose of the change
   _The AWS Glue Schema Registry is a new feature of AWS Glue that allows you to centrally discover, control, and evolve data stream schemas. This request is to add a new format to launch an integration for Apache Flink with AWS Glue Schema Registry._
   
   ## Brief change log
   * _Added `flink-avro-glue-schema-registry` module under `flink-formats`_
   * _Added end-to-end test named `flink-glue-schema-registry-test` for the new module_
   
   ## Verifying this change
   This change added tests and can be verified as follows:
   
   * _Added integration tests for end-to-end deployment_
   
   ## Does this pull request potentially affect one of the following parts:
   * Dependencies (does it add or upgrade a dependency): (yes)
   * The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (no)
   * The serializers: (yes)
   * The runtime per-record code paths (performance sensitive): (don't know)
   * Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn/Mesos, ZooKeeper: (no)
   * The S3 file system connector: (no)
   
   ## Documentation
   * Does this pull request introduce a new feature? (yes)
   * If yes, how is the feature documented? (JavaDocs)


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793459808


   My commit needs to be included in the PR so that it ends up in master.
   If this environment variable is not available (for example here during CI verification, or with personal Azure accounts that don't have this secret set up), the test is skipped.
   If the environment variable is set (for example on apache/flink master, my personal CI account, or maybe your personal CI account), the tests are executed.
   
   What we need to figure out is why the env variables are not picked up by your test code in my CI environment. They seem to be set, because the bash script "decided" to run the test. I recommend debugging / resolving this issue on your personal Azure CI account.
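
   The credential gate rmetzger describes can be sketched as a small shell function. The variable names follow the snippet quoted elsewhere in this thread; the function name and messages are illustrative, not taken from the actual `run-nightly-tests.sh`:

```shell
# Sketch of the credential gate for the nightly e2e scripts: run the
# Glue Schema Registry end-to-end test only when both secrets are present.
# Uses `-z` ("string is empty") on quoted, default-expanded variables so
# unset and empty both count as "missing".
run_glue_e2e_if_creds_present() {
  if [ -z "${IT_CASE_GLUE_SCHEMA_ACCESS_KEY:-}" ] || [ -z "${IT_CASE_GLUE_SCHEMA_SECRET_KEY:-}" ]; then
    echo "skipping Glue Schema Registry e2e test (no credentials)"
    return 0
  fi
  echo "running Glue Schema Registry e2e test"
  # run_test "AWS Glue Schema Registry nightly end-to-end test" ...
}
```

   On CI accounts without the secret, the function falls through to the skip branch, which matches the "it will be ignored" behavior described above.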





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * bd617892bfec1db1654606355041a6e4b9050304 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-791350113


   > Looks like the test failed: https://dev.azure.com/rmetzger/Flink/_build/results?buildId=8940&view=logs&j=9401bf33-03c4-5a24-83fe-e51d75db73ef&t=72901ab2-7cd0-57be-82b1-bca51de20fba
   
   ```
   2021-03-05T09:34:01.3858012Z Mar 05 09:34:01 2021-03-05 09:33:58,441 ERROR org.apache.flink.client.cli.CliFrontend                      [] - Error while running the command.
   2021-03-05T09:34:01.3859239Z Mar 05 09:34:01 org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: flink-glue-schema-registry-test/src/main/java/resources/avro/user.avsc (No such file or directory)
   2021-03-05T09:34:01.3860519Z Mar 05 09:34:01 	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:372) ~[flink-dist_2.11-1.13-SNAPSHOT.jar:1.13-SNAPSHOT]
   2021-03-05T09:34:01.3862040Z Mar 05 09:34:01 	at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222) ~[flink-dist_2.11-1.13-SNAPSHOT.jar:1.13-SNAPSHOT]
   2021-03-05T09:34:01.3863352Z Mar 05 09:34:01 	at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114) ~[flink-dist_2.11-1.13-SNAPSHOT.jar:1.13-SNAPSHOT]
   2021-03-05T09:34:01.3864588Z Mar 05 09:34:01 	at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:812) ~[flink-dist_2.11-1.13-SNAPSHOT.jar:1.13-SNAPSHOT]
   2021-03-05T09:34:01.3865802Z Mar 05 09:34:01 	at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246) ~[flink-dist_2.11-1.13-SNAPSHOT.jar:1.13-SNAPSHOT]
   2021-03-05T09:34:01.3867008Z Mar 05 09:34:01 	at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054) ~[flink-dist_2.11-1.13-SNAPSHOT.jar:1.13-SNAPSHOT]
   2021-03-05T09:34:01.3868238Z Mar 05 09:34:01 	at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132) ~[flink-dist_2.11-1.13-SNAPSHOT.jar:1.13-SNAPSHOT]
   2021-03-05T09:34:01.3869570Z Mar 05 09:34:01 	at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28) [flink-dist_2.11-1.13-SNAPSHOT.jar:1.13-SNAPSHOT]
   2021-03-05T09:34:01.3870823Z Mar 05 09:34:01 	at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132) [flink-dist_2.11-1.13-SNAPSHOT.jar:1.13-SNAPSHOT]
   2021-03-05T09:34:01.3872014Z Mar 05 09:34:01 Caused by: java.io.FileNotFoundException: flink-glue-schema-registry-test/src/main/java/resources/avro/user.avsc (No such file or directory)
   2021-03-05T09:34:01.3872759Z Mar 05 09:34:01 	at java.io.FileInputStream.open0(Native Method) ~[?:1.8.0_282]
   2021-03-05T09:34:01.3873370Z Mar 05 09:34:01 	at java.io.FileInputStream.open(FileInputStream.java:195) ~[?:1.8.0_282]
   2021-03-05T09:34:01.3874051Z Mar 05 09:34:01 	at java.io.FileInputStream.<init>(FileInputStream.java:138) ~[?:1.8.0_282]
   2021-03-05T09:34:01.3874760Z Mar 05 09:34:01 	at com.fasterxml.jackson.core.JsonFactory.createParser(JsonFactory.java:1029) ~[?:?]
   2021-03-05T09:34:01.3875443Z Mar 05 09:34:01 	at org.apache.avro.Schema$Parser.parse(Schema.java:1388) ~[?:?]
   2021-03-05T09:34:01.3876216Z Mar 05 09:34:01 	at org.apache.flink.glue.schema.registry.test.GlueSchemaRegistryExample.getSchema(GlueSchemaRegistryExample.java:96) ~[?:?]
   2021-03-05T09:34:01.3877140Z Mar 05 09:34:01 	at org.apache.flink.glue.schema.registry.test.GlueSchemaRegistryExampleTest.getRecords(GlueSchemaRegistryExampleTest.java:108) ~[?:?]
   2021-03-05T09:34:01.3878145Z Mar 05 09:34:01 	at org.apache.flink.glue.schema.registry.test.GlueSchemaRegistryExampleTest.main(GlueSchemaRegistryExampleTest.java:71) ~[?:?]
   2021-03-05T09:34:01.3878917Z Mar 05 09:34:01 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_282]
   2021-03-05T09:34:01.3879654Z Mar 05 09:34:01 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_282]
   2021-03-05T09:34:01.3880463Z Mar 05 09:34:01 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_282]
   2021-03-05T09:34:01.3881354Z Mar 05 09:34:01 	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_282]
   2021-03-05T09:34:01.3882522Z Mar 05 09:34:01 	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355) ~[flink-dist_2.11-1.13-SNAPSHOT.jar:1.13-SNAPSHOT]
   2021-03-05T09:34:01.3883225Z Mar 05 09:34:01 	... 8 more
   ```
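
   The `FileNotFoundException` above is a working-directory problem: the schema is opened with a filesystem-relative path, which only resolves if the process happens to run from the project root. A minimal shell illustration of the failure mode (the directory layout here is made up for the demo):

```shell
# Create a fake project layout containing a schema file.
workdir=$(mktemp -d)
mkdir -p "$workdir/project/src/main/resources/avro"
echo '{"type": "record", "name": "User", "fields": []}' \
  > "$workdir/project/src/main/resources/avro/user.avsc"

# From the project root, the relative path resolves...
( cd "$workdir/project" && cat src/main/resources/avro/user.avsc >/dev/null ) \
  && echo "found from project root"

# ...but from any other working directory (as on the CI worker), the same
# relative path fails with "No such file or directory".
( cd "$workdir" && cat src/main/resources/avro/user.avsc >/dev/null 2>&1 ) \
  || echo "not found from elsewhere"
```

   The usual fix in a packaged Flink job is to load the schema from the classpath (e.g. via `ClassLoader#getResourceAsStream`) rather than with `FileInputStream`, so the lookup no longer depends on where the JVM was started.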





[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-780601429


   I am seeing a test failure when running `mvn clean install` on the `flink-avro-glue-schema-registry` module:
   
   ```
   [ERROR] Errors:
   [ERROR]   GlueSchemaRegistryAvroSerializationSchemaTest.<init>:58 » IllegalArgument glue...
   [ERROR]   GlueSchemaRegistryAvroSerializationSchemaTest.<init>:58 » IllegalArgument glue...
   [ERROR]   GlueSchemaRegistryAvroSerializationSchemaTest.<init>:58 » IllegalArgument glue...
   [ERROR]   GlueSchemaRegistryAvroSerializationSchemaTest.<init>:58 » IllegalArgument glue...
   [ERROR]   GlueSchemaRegistryAvroSerializationSchemaTest.<init>:58 » IllegalArgument glue...
   [INFO]
   [ERROR] Tests run: 19, Failures: 0, Errors: 5, Skipped: 0
   ```





[GitHub] [flink] mohitpali edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
mohitpali edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-767211338


   Apologies for the confusion; we have closed the other PR. We had to create a new PR because two developers were working on the change, hence the different login. I have included some CI compilation fixes in this PR and rebased.





[GitHub] [flink] LinyuYao1021 removed a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 removed a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766989857


   It's the same one, but with the compilation error fixed.





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-792650616


   https://dev.azure.com/rmetzger/Flink/_build/results?buildId=8949&view=results





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-794233139


   Have you set up your CI with the password as well, and verified the change?





[GitHub] [flink] mohitpali commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
mohitpali commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-767211338


   Apologies for the confusion; we have closed the other PR. We had to create a new PR because two developers were working on the change, hence the different login. I have included some CI compilation fixes in this PR and rebased.





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793563537


   > > It looks like the GSR test is running even when the creds are not set. Can we try setting:
   > > ```
   > > if [ -n "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" ] && [ -n "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]; then
   > >   run_test "AWS Glue Schema Registry nightly end-to-end test" "$END_TO_END_DIR/test-scripts/test_glue_schema_registry.sh"
   > > fi
   > > ```
   > > 
   > > 
   > > to
   > > ```
   > > if [ "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" ] && [ "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]; then
   > >   run_test "AWS Glue Schema Registry nightly end-to-end test" "$END_TO_END_DIR/test-scripts/test_glue_schema_registry.sh"
   > > fi
   > > ```
   > 
   > Updated
   
   After digging into this, I find that `-n` tests whether a string is non-empty, i.e. whether its length is greater than zero, so we can use it for the condition check. The problem is that we should use `[[ ]]` instead of `[ ]` to ensure an unquoted, empty variable doesn't break the check.
   
   > The `-z` approach should work: https://github.com/apache/flink/blob/master/flink-end-to-end-tests/test-scripts/common_s3.sh#L25
   > but it's worth a try.
   > 
   > Can you also add
   > 
   > ```
   > SECRET_S3_ACCESS_KEY: $[variables.IT_CASE_S3_ACCESS_KEY]
   > SECRET_S3_SECRET_KEY: $[variables.IT_CASE_S3_SECRET_KEY]
   > ```
   > 
   > to `build-apache-repo.yml`?
   
   Yes, I've added in the latest commit.
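
   For reference, the practical difference between the two bracket forms comes down to word splitting: with `[ ]`, an unquoted empty variable disappears during expansion and `[ -n $VAR ]` degenerates to `[ -n ]`, a one-argument test that is always true; `[[ ]]` performs no word splitting, so it is safe either way. A quick bash demonstration:

```shell
VAR=""

# Unquoted with single brackets: $VAR expands to nothing, leaving `[ -n ]`,
# which tests that the literal string "-n" is non-empty -- always true.
[ -n $VAR ] && echo "single bracket, unquoted: true (surprising)"

# Quoting the variable makes `[ ]` behave as intended...
[ -n "$VAR" ] || echo "single bracket, quoted: false"

# ...and `[[ ]]` is robust even without quotes.
[[ -n $VAR ]] || echo "double bracket, unquoted: false"
```

   Either quoting the expansion inside `[ ]` or switching to `[[ ]]` avoids the surprise; the key is never to leave an expansion unquoted inside single brackets.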





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-790878966


   I pushed your rebased branch also again: https://dev.azure.com/rmetzger/Flink/_build/results?buildId=8938&view=results





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r578805957



##########
File path: flink-end-to-end-tests/test-scripts/test_glue_schema_registry.sh
##########
@@ -0,0 +1,72 @@
+#!/usr/bin/env bash

Review comment:
       @LinyuYao1021 Please update [run-nightly-tests.sh](https://github.com/apache/flink/blob/master/flink-end-to-end-tests/run-nightly-tests.sh) to include this test







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * a87a26aa218658f7098367cae8c7a2ed18430296 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * c77e3d4a1a7484c4f1e24b23a1099364b834cf75 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153) 
   * a87a26aa218658f7098367cae8c7a2ed18430296 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577863709



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       The change may have been lost during rebasing. I will add it again in the next commit.







[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r578014029



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org/apache/flink/glue/schema/registry/test/GlueSchemaRegistryExample.java
##########
@@ -0,0 +1,110 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.java.utils.ParameterTool;
+import org.apache.flink.formats.avro.glue.schema.registry.GlueSchemaRegistryAvroDeserializationSchema;
+import org.apache.flink.formats.avro.glue.schema.registry.GlueSchemaRegistryAvroSerializationSchema;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
+import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisProducer;
+import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
+import org.apache.flink.streaming.kafka.test.base.KafkaExampleUtil;
+
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
+import org.apache.avro.Schema;
+import org.apache.avro.generic.GenericRecord;
+
+import java.io.File;
+import java.io.IOException;
+import java.net.URL;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Properties;
+
+/**
+ * A simple example that shows how to read from and write to Kinesis. This will read Avro messages
+ * from the input topic, and finally write back to another topic.

Review comment:
       Fixed







[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574346915



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-clients_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       If it is only required for tests within this module, please add a `<scope>test</scope>`. However, since it compiles without it, it might instead be required by a downstream module for tests, in which case it can be added there.
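
       For reference, the scoped variant suggested above would look like this (same coordinates as in the diff; only the added `<scope>` line is new):

```xml
<dependency>
	<groupId>org.apache.flink</groupId>
	<artifactId>flink-clients_${scala.binary.version}</artifactId>
	<version>${project.version}</version>
	<!-- keeps the dependency off the compile/runtime classpath of consumers -->
	<scope>test</scope>
</dependency>
```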







[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r575091145



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-clients_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       Then the customer should add the dependency in their own package. We should target a minimal set of dependencies and only import the things we need.







[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-794238904


   > password
   
   Not yet, I haven't used Azure before. Could you quickly walk me through how to set up CI?





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * bafd3a9a41461850057e9f2d50c3cea1c52aad7a Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14340) 
   * c4eb439a79d18f8296055e3582a0093146cbacc7 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14504) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793481182


   > My commit needs to be included into the PR so that it ends up in master.
   > If this environment variable is not available (for example here during CI verification, or with personal Azure accounts that don't have this secret setup), it will be ignored.
   > If the environment variable is set (for example on apache/flink master, my personal CI account or maybe your personal CI account), the tests are executing.
   > 
   > What we need to figure out is why the env variables are not picked up from your test code in my CI environment. It seems that they are set, because the bash script "decided" to run the test. I recommend you to debug / resolve this issue on your personal azure CI account.
   
   Currently, neither the main CI nor your CI is working. The main CI doesn't skip my e2e test, and yours can't pick up the env variables. I dove deep into the secret forwarding and have a question about this [part](https://github.com/rmetzger/flink/commit/697c40cad14f42119604b3e754c4a52ede4a5c82#diff-7915b9b726a397ae7ba6af7b9703633d21c031ebf21682f3ee7e6a4ec52837a5R59). Why do we need to pass the value of `$IT_CASE_GLUE_SCHEMA_ACCESS_KEY` to `$SECRET_GLUE_SCHEMA_ACCESS_KEY`? My understanding of secret forwarding is that we will have the env variable `$SECRET_GLUE_SCHEMA_ACCESS_KEY` on personal CI and pass it to `$IT_CASE_GLUE_SCHEMA_ACCESS_KEY`.
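
   For what it's worth, the forwarding being debated can be sketched as the following shell fragment (variable names come from this thread; the actual Azure Pipelines wiring is an assumption, not the real script):

```shell
#!/bin/sh
# Sketch of the secret forwarding under discussion, wrapped in a function
# so the direction of the mapping is explicit.
forward_glue_secret() {
    # If the CI account defines the secret, re-export it under the name the
    # e2e test code reads; otherwise leave it unset so the test is skipped.
    if [ -n "${SECRET_GLUE_SCHEMA_ACCESS_KEY:-}" ]; then
        export IT_CASE_GLUE_SCHEMA_ACCESS_KEY="${SECRET_GLUE_SCHEMA_ACCESS_KEY}"
    fi
}
```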





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r578255800



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoder.java
##########
@@ -0,0 +1,81 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import org.apache.avro.Schema;
+
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.Map;
+
+/**
+ * Schema coder that allows reading schema that is somehow embedded into serialized record. Used by
+ * {@link GlueSchemaRegistryAvroDeserializationSchema} and {@link
+ * GlueSchemaRegistryAvroSerializationSchema}.
+ */
+public class GlueSchemaRegistryAvroSchemaCoder implements SchemaCoder {
+    private GlueSchemaRegistryInputStreamDeserializer glueSchemaRegistryInputStreamDeserializer;
+    private GlueSchemaRegistryOutputStreamSerializer glueSchemaRegistryOutputStreamSerializer;
+
+    /**
+     * Constructor accepts transport name and configuration map for AWS Glue Schema Registry.
+     *
+     * @param transportName topic name or stream name etc.
+     * @param configs configurations for AWS Glue Schema Registry
+     */
+    public GlueSchemaRegistryAvroSchemaCoder(
+            final String transportName, final Map<String, Object> configs) {
+        glueSchemaRegistryInputStreamDeserializer =
+                new GlueSchemaRegistryInputStreamDeserializer(configs);
+        glueSchemaRegistryOutputStreamSerializer =
+                new GlueSchemaRegistryOutputStreamSerializer(transportName, configs);
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSchemaCoder(
+            final GlueSchemaRegistryInputStreamDeserializer
+                    glueSchemaRegistryInputStreamDeserializer) {
+        this.glueSchemaRegistryInputStreamDeserializer = glueSchemaRegistryInputStreamDeserializer;
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSchemaCoder(
+            final GlueSchemaRegistryOutputStreamSerializer
+                    glueSchemaRegistryOutputStreamSerializer) {
+        this.glueSchemaRegistryOutputStreamSerializer = glueSchemaRegistryOutputStreamSerializer;
+    }
+
+    @Override
+    public Schema readSchema(InputStream in) throws IOException {
+        return glueSchemaRegistryInputStreamDeserializer.getSchemaAndDeserializedStream(in);
+    }
+
+    @Override
+    public void writeSchema(Schema schema, OutputStream out) throws IOException {
+        byte[] data = ((ByteArrayOutputStream) out).toByteArray();

Review comment:
       The Precondition check should go before the cast
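
       A minimal, self-contained sketch of the suggested ordering (`toBytes` is a hypothetical helper standing in for the cast in `writeSchema`; the real code would use Flink's `Preconditions` utility):

```java
import java.io.ByteArrayOutputStream;
import java.io.OutputStream;

public class PreconditionSketch {
    // Hypothetical helper mirroring the review suggestion: validate the
    // stream type *before* casting, so a misuse fails with a descriptive
    // message instead of a ClassCastException from the cast itself.
    static byte[] toBytes(OutputStream out) {
        if (!(out instanceof ByteArrayOutputStream)) {
            throw new IllegalStateException(
                    "Expected ByteArrayOutputStream, got " + out.getClass().getName());
        }
        return ((ByteArrayOutputStream) out).toByteArray();
    }
}
```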







[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574115398



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryInputStreamDeserializer.java
##########
@@ -0,0 +1,85 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.utils.MutableByteArrayInputStream;
+
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
+import org.apache.avro.Schema;
+import org.apache.avro.SchemaParseException;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.Map;
+
+/**
+ * AWS Glue Schema Registry input stream de-serializer to accept input stream and extract schema
+ * from it and remove schema registry information in the input stream.
+ */
+public class GlueSchemaRegistryInputStreamDeserializer {
+    private final AWSDeserializer awsDeserializer;
+
+    /**
+     * Constructor accepts configuration map for AWS Deserializer.
+     *
+     * @param configs configuration map
+     */
+    public GlueSchemaRegistryInputStreamDeserializer(Map<String, Object> configs) {
+        awsDeserializer =
+                AWSDeserializer.builder()
+                        .credentialProvider(DefaultCredentialsProvider.builder().build())
+                        .configs(configs)
+                        .build();
+    }
+
+    public GlueSchemaRegistryInputStreamDeserializer(AWSDeserializer awsDeserializer) {
+        this.awsDeserializer = awsDeserializer;
+    }
+
+    /**
+     * Get schema and remove extra Schema Registry information within input stream.
+     *
+     * @param in input stream
+     * @return schema of object within input stream
+     * @throws IOException Exception during decompression
+     */
+    public Schema getSchemaAndDeserializedStream(InputStream in) throws IOException {
+        byte[] inputBytes = new byte[in.available()];
+        in.read(inputBytes);
+        in.reset();
+
+        MutableByteArrayInputStream mutableByteArrayInputStream = (MutableByteArrayInputStream) in;
+        String schemaDefinition = awsDeserializer.getSchema(inputBytes).getSchemaDefinition();
+        byte[] deserializedBytes = awsDeserializer.getActualData(inputBytes);
+        mutableByteArrayInputStream.setBuffer(deserializedBytes);
+
+        Schema schema;
+        try {
+            schema = (new Schema.Parser()).parse(schemaDefinition);

Review comment:
       Schema parsing is used here because GSR returns its own defined `Schema` class from the serialized byte array.
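
   A minimal sketch of that parsing step, assuming only the Avro library (the schema-definition JSON below is an illustrative example, not one returned by GSR):

   ```java
   import org.apache.avro.Schema;

   public class SchemaParseSketch {
       public static void main(String[] args) {
           // GSR hands back the schema definition as a JSON string; Avro's parser
           // turns it into an org.apache.avro.Schema usable by Flink's Avro format.
           String schemaDefinition =
                   "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
                           + "{\"name\":\"name\",\"type\":\"string\"}]}";
           Schema schema = new Schema.Parser().parse(schemaDefinition);
           System.out.println(schema.getName()); // prints "User"
       }
   }
   ```

   `Schema.Parser#parse` throws `SchemaParseException` on malformed definitions, which is why the surrounding code wraps the call in a try block.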




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-791962050


   https://dev.azure.com/rmetzger/Flink/_build/results?buildId=8944&view=results





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148) Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112) 
   * c77e3d4a1a7484c4f1e24b23a1099364b834cf75 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 81adc9cccd9fc0247e58ad7688252e1d98382cc5 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401) 
   * 56566c305fba167267cf427dddbf33c62b04f997 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766989857









[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r587221640



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/User.java
##########
@@ -0,0 +1,434 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.avro.message.BinaryMessageDecoder;
+import org.apache.avro.message.BinaryMessageEncoder;
+import org.apache.avro.message.SchemaStore;
+import org.apache.avro.specific.SpecificData;
+
+@SuppressWarnings("all")
+@org.apache.avro.specific.AvroGenerated
+public class User extends org.apache.avro.specific.SpecificRecordBase

Review comment:
       Okay, I misunderstood what the dependency does. Will address this in a follow-up PR.







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * d6f06f07e0895117b345b99533ba7eda672ba765 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489) 
   * 0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r578009632



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/User.java
##########
@@ -0,0 +1,434 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.avro.message.BinaryMessageDecoder;
+import org.apache.avro.message.BinaryMessageEncoder;
+import org.apache.avro.message.SchemaStore;
+import org.apache.avro.specific.SpecificData;
+
+@SuppressWarnings("all")
+@org.apache.avro.specific.AvroGenerated
+public class User extends org.apache.avro.specific.SpecificRecordBase

Review comment:
       Currently, there's no `avro` file that can be used directly, so this file is still needed.
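
   As a sketch of the alternative being discussed — generating `User` from an `.avsc` schema file at build time instead of checking in the generated source — the avro-maven-plugin could be configured along these lines (the directory paths are illustrative assumptions, not the PR's layout):

   ```xml
   <plugin>
   	<groupId>org.apache.avro</groupId>
   	<artifactId>avro-maven-plugin</artifactId>
   	<version>${avro.version}</version>
   	<executions>
   		<execution>
   			<phase>generate-test-sources</phase>
   			<goals>
   				<goal>schema</goal>
   			</goals>
   			<configuration>
   				<testSourceDirectory>${project.basedir}/src/test/resources/avro</testSourceDirectory>
   				<testOutputDirectory>${project.basedir}/target/generated-test-sources/avro</testOutputDirectory>
   			</configuration>
   		</execution>
   	</executions>
   </plugin>
   ```

   With such a setup the generated class never lives in version control, which is what the review was driving at.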







[GitHub] [flink] dannycranmer edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-799565686


   CI is passing with and without AWS credentials. I will merge this now:
   - https://dev.azure.com/georgeryan1322/Flink/_build/results?buildId=360&view=results





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574177985



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       Adding this scope will cause a `ClassNotFoundException`.







[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574343844



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderTest.java
##########
@@ -0,0 +1,287 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import com.amazonaws.services.schemaregistry.caching.AWSSchemaRegistrySerializerCache;
+import com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryClient;
+import com.amazonaws.services.schemaregistry.common.configs.GlueSchemaRegistryConfiguration;
+import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
+import com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistrySerializationFacade;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import org.apache.avro.Schema;
+import org.hamcrest.Matchers;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.EnumSource;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
+import software.amazon.awssdk.services.glue.model.EntityNotFoundException;
+
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.File;
+import java.io.IOException;
+import java.io.InputStream;
+import java.lang.reflect.Field;
+import java.nio.ByteBuffer;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.UUID;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.CoreMatchers.notNullValue;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.ArgumentMatchers.anyMap;
+import static org.mockito.ArgumentMatchers.anyString;
+import static org.mockito.Mockito.doCallRealMethod;
+import static org.mockito.Mockito.spy;
+import static org.mockito.Mockito.when;

Review comment:
       I fell foul of this on my first contribution too.
   
   The guidelines are [here](https://flink.apache.org/contributing/code-style-and-quality-common.html), see "Avoid Mockito".
   
   For the KDS integration we have a [Behaviour factory](https://github.com/apache/flink/blob/master/flink-connectors/flink-connector-kinesis/src/test/java/org/apache/flink/streaming/connectors/kinesis/testutils/FakeKinesisBehavioursFactory.java) that builds mocks for you with fake behaviours.
   
   If you are struggling with any particular example let me know and I will help.
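
   To illustrate the fake-over-mock approach (the `SchemaLookup` interface and names below are hypothetical stand-ins, not part of the PR): instead of stubbing a client with `when(...)`, a small hand-written test double implements the collaborator's interface directly:

   ```java
   import java.util.HashMap;
   import java.util.Map;

   // Hypothetical collaborator interface, standing in for the registry client.
   interface SchemaLookup {
       String getSchemaDefinition(String schemaName);
   }

   // A fake with canned behaviour, in the spirit of FakeKinesisBehavioursFactory:
   // state and behaviour are explicit, with no mocking framework involved.
   class FakeSchemaLookup implements SchemaLookup {
       private final Map<String, String> definitions = new HashMap<>();

       void register(String schemaName, String definition) {
           definitions.put(schemaName, definition);
       }

       @Override
       public String getSchemaDefinition(String schemaName) {
           String definition = definitions.get(schemaName);
           if (definition == null) {
               throw new IllegalStateException("Unknown schema: " + schemaName);
           }
           return definition;
       }
   }

   public class FakeExample {
       public static void main(String[] args) {
           FakeSchemaLookup fake = new FakeSchemaLookup();
           fake.register("user", "{\"type\":\"string\"}");
           System.out.println(fake.getSchemaDefinition("user"));
       }
   }
   ```

   The test then exercises real code paths against the fake, which keeps assertions about behaviour rather than about interactions.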







[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766724268









[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577486660



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderProvider.java
##########
@@ -0,0 +1,53 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import lombok.NonNull;

Review comment:
       The `NonNull` annotation makes sure that users don't pass `null` for `configs`.
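
   For reference, Lombok's `@NonNull` on a parameter generates roughly the explicit check below (a sketch of equivalent plain Java, not the generated bytecode; the class name is illustrative):

   ```java
   import java.util.Map;
   import java.util.Objects;

   public class NonNullSketch {
       private final Map<String, Object> configs;

       // Equivalent of `public NonNullSketch(@NonNull Map<String, Object> configs)`:
       // Lombok inserts a null check that throws NullPointerException up front.
       public NonNullSketch(Map<String, Object> configs) {
           this.configs =
                   Objects.requireNonNull(configs, "configs is marked non-null but is null");
       }

       public int size() {
           return configs.size();
       }
   }
   ```

   Using `Objects.requireNonNull` directly would avoid pulling Lombok into the module's API surface, which is one way reviewers sometimes resolve this kind of comment.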

##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSerializationSchema.java
##########
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.formats.avro.RegistryAvroSerializationSchema;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import lombok.SneakyThrows;
+import org.apache.avro.Schema;
+import org.apache.avro.generic.GenericRecord;
+import org.apache.avro.io.Encoder;
+import org.apache.avro.specific.SpecificRecord;
+
+import javax.annotation.Nullable;
+
+import java.io.ByteArrayOutputStream;
+import java.util.Map;
+
+/**
+ * AWS Glue Schema Registry Serialization schema to serialize to Avro binary format for Flink
+ * Producer user.
+ *
+ * @param <T> the type to be serialized
+ */
+public class GlueSchemaRegistryAvroSerializationSchema<T>
+        extends RegistryAvroSerializationSchema<T> {
+    /**
+     * Creates an Avro serialization schema.
+     *
+     * @param recordClazz class to serialize. Should be one of: {@link SpecificRecord}, {@link
+     *     GenericRecord}.
+     * @param reader reader's Avro schema. Should be provided if recordClazz is {@link
+     *     GenericRecord}
+     * @param schemaCoderProvider schema coder provider which reads writer schema from AWS Glue
+     *     Schema Registry
+     */
+    private GlueSchemaRegistryAvroSerializationSchema(
+            Class<T> recordClazz,
+            @Nullable Schema reader,
+            SchemaCoder.SchemaCoderProvider schemaCoderProvider) {
+        super(recordClazz, reader, schemaCoderProvider);
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSerializationSchema(
+            Class<T> recordClazz, @Nullable Schema reader, SchemaCoder schemaCoder) {
+        // Pass null schema coder provider
+        super(recordClazz, reader, null);
+        this.schemaCoder = schemaCoder;
+    }
+
+    /**
+     * Creates {@link GlueSchemaRegistryAvroSerializationSchema} that serializes {@link
+     * GenericRecord} using provided schema.
+     *
+     * @param schema the schema that will be used for serialization
+     * @param transportName topic name or stream name etc.
+     * @param configs configuration map of AWS Glue Schema Registry
+     * @return serialized record in form of byte array
+     */
+    public static GlueSchemaRegistryAvroSerializationSchema<GenericRecord> forGeneric(
+            Schema schema, String transportName, Map<String, Object> configs) {
+        return new GlueSchemaRegistryAvroSerializationSchema<>(
+                GenericRecord.class,
+                schema,
+                new GlueSchemaRegistryAvroSchemaCoderProvider(transportName, configs));
+    }
+
+    /**
+     * Creates {@link GlueSchemaRegistryAvroSerializationSchema} that serializes {@link
+     * SpecificRecord} using provided schema.
+     *
+     * @param clazz the type to be serialized
+     * @param transportName transport name, e.g. topic name or stream name
+     * @param configs configuration map for AWS Glue Schema Registry
+     * @return serialization schema that writes records as byte arrays
+     */
+    public static <T extends SpecificRecord>
+            GlueSchemaRegistryAvroSerializationSchema<T> forSpecific(
+                    Class<T> clazz, String transportName, Map<String, Object> configs) {
+        return new GlueSchemaRegistryAvroSerializationSchema<>(
+                clazz, null, new GlueSchemaRegistryAvroSchemaCoderProvider(transportName, configs));
+    }
+
+    /**
+     * Serializes the incoming element to a byte array that carries AWS Glue Schema Registry
+     * header information.
+     *
+     * @param object The incoming element to be serialized
+     * @return The serialized bytes.
+     */
+    @SneakyThrows

Review comment:
       Fixed

##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-clients_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       Fixed

##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       Fixed




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574033549



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-clients_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>software.amazon.glue</groupId>
+			<artifactId>schema-registry-serde</artifactId>
+			<version>${glue.schema.registry.version}</version>
+		</dependency>
+
+		<!-- test dependencies -->
+
+		<dependency>
+			<groupId>org.junit.jupiter</groupId>
+			<artifactId>junit-jupiter-api</artifactId>
+			<version>${junit.jupiter.version}</version>
+			<scope>test</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.junit.jupiter</groupId>
+			<artifactId>junit-jupiter-params</artifactId>
+			<version>${junit.jupiter.version}</version>
+			<scope>test</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.mockito</groupId>
+			<artifactId>mockito-junit-jupiter</artifactId>
+			<version>${mockito.version}</version>
+			<scope>test</scope>
+		</dependency>

Review comment:
       I'm using a different `junit` version to support `mockito`. Since `mockito` is not recommended, I'll change the version.







[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574342605



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryInputStreamDeserializer.java
##########
@@ -0,0 +1,85 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.utils.MutableByteArrayInputStream;
+
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
+import org.apache.avro.Schema;
+import org.apache.avro.SchemaParseException;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.Map;
+
+/**
+ * AWS Glue Schema Registry input stream deserializer that accepts an input stream, extracts
+ * the schema from it, and removes the Schema Registry information from the stream.
+ */
+public class GlueSchemaRegistryInputStreamDeserializer {
+    private final AWSDeserializer awsDeserializer;
+
+    /**
+     * Constructor accepts configuration map for AWS Deserializer.
+     *
+     * @param configs configuration map
+     */
+    public GlueSchemaRegistryInputStreamDeserializer(Map<String, Object> configs) {
+        awsDeserializer =
+                AWSDeserializer.builder()
+                        .credentialProvider(DefaultCredentialsProvider.builder().build())
+                        .configs(configs)
+                        .build();
+    }
+
+    public GlueSchemaRegistryInputStreamDeserializer(AWSDeserializer awsDeserializer) {
+        this.awsDeserializer = awsDeserializer;
+    }
+
+    /**
+     * Gets the schema and removes the extra Schema Registry information from the input stream.
+     *
+     * @param in input stream
+     * @return schema of object within input stream
+     * @throws IOException Exception during decompression
+     */
+    public Schema getSchemaAndDeserializedStream(InputStream in) throws IOException {
+        byte[] inputBytes = new byte[in.available()];
+        in.read(inputBytes);
+        in.reset();
+
+        MutableByteArrayInputStream mutableByteArrayInputStream = (MutableByteArrayInputStream) in;
+        String schemaDefinition = awsDeserializer.getSchema(inputBytes).getSchemaDefinition();
+        byte[] deserializedBytes = awsDeserializer.getActualData(inputBytes);
+        mutableByteArrayInputStream.setBuffer(deserializedBytes);
+
+        Schema schema;
+        try {
+            schema = (new Schema.Parser()).parse(schemaDefinition);

Review comment:
       The question was focusing on the frequency of deserialisation. To improve performance can we deserialise schema for specific info once, and cache it? Or are we expecting the schema definition to change over time? What happens if the schema changes for a `SpecificRecord`? Is the idea that the Flink job would fail if the upstream data format changes in a non compatible way?
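   The caching the reviewer asks about could look roughly like the sketch below. This is illustrative only and not part of the PR: `Function<String, S>` stands in for `new Schema.Parser()::parse` (a fresh `Schema.Parser` would be needed per cache miss, since an Avro parser rejects re-parsing a schema with a name it has already seen), and the class and method names are made up for the example.

   ```java
   import java.util.concurrent.ConcurrentHashMap;
   import java.util.function.Function;

   /** Hypothetical sketch of a schema-definition cache, keyed by definition text. */
   public final class SchemaDefinitionCache<S> {
       private final ConcurrentHashMap<String, S> cache = new ConcurrentHashMap<>();
       private final Function<String, S> parser;

       public SchemaDefinitionCache(Function<String, S> parser) {
           this.parser = parser;
       }

       /** Parses each distinct definition exactly once; repeated lookups are cache hits. */
       public S get(String definition) {
           return cache.computeIfAbsent(definition, parser);
       }

       public int size() {
           return cache.size();
       }
   }
   ```

   A cache like this answers the frequency concern for stable schemas; if the upstream schema evolves, a new definition string is simply a new cache entry, and an incompatible change for a `SpecificRecord` would still surface as a read failure downstream.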







[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793146783


   > There are no secrets setup for the main CI (which runs the pull request validation). We can not do this, because people could steal our credentials by opening a pull request exporting the secrets from the env variables.
   > My Azure account has the secrets set up.
   > I would recommend you to do the same on your personal Azure account (It's free).
   
   So do you mean that I shouldn't include your [commit](https://github.com/rmetzger/flink/commit/697c40cad14f42119604b3e754c4a52ede4a5c82) to enable secret forwarding in the main CI? Is this commit only for your or my personal testing?





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r583535654



##########
File path: flink-end-to-end-tests/test-scripts/test_glue_schema_registry.sh
##########
@@ -0,0 +1,78 @@
+#!/usr/bin/env bash
+################################################################################
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+################################################################################
+# To run this test locally, AWS credential is required.

Review comment:
       @LinyuYao1021 I am trying to run this using the policy Mohit provided (the one we are deploying to the CI server):
   
   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Sid": "AWSGlueSchemaRegistryFullAccess",
               "Effect": "Allow",
               "Action": [
                   "glue:CreateRegistry",
                   "glue:UpdateRegistry",
                   "glue:DeleteRegistry",
                   "glue:GetRegistry",
                   "glue:ListRegistries",
                   "glue:CreateSchema",
                   "glue:UpdateSchema",
                   "glue:DeleteSchema",
                   "glue:GetSchema",
                   "glue:ListSchemas",
                   "glue:RegisterSchemaVersion",
                   "glue:DeleteSchemaVersions",
                   "glue:GetSchemaByDefinition",
                   "glue:GetSchemaVersion",
                   "glue:GetSchemaVersionsDiff",
                   "glue:ListSchemaVersions",
                   "glue:CheckSchemaVersionValidity",
                   "glue:PutSchemaVersionMetadata",
                   "glue:RemoveSchemaVersionMetadata",
                   "glue:QuerySchemaVersionMetadata"
               ],
               "Resource": [
                   "*"
               ]
           },
           {
               "Sid": "AWSGlueSchemaRegistryTagsFullAccess",
               "Effect": "Allow",
               "Action": [
                   "glue:GetTags",
                   "glue:TagResource",
                   "glue:UnTagResource"
               ],
               "Resource": [
                   "arn:aws:glue:*:*:schema/*",
                   "arn:aws:glue:*:*:registry/*"
               ]
           }
       ]
   }
   ```
   
   I am seeing the following error:
   
   `[FAIL] 'test-scripts/test_glue_schema_registry.sh' failed after 1 minutes and 7 seconds! Test exited with exit code 1 and the logs contained errors, exceptions or non-empty .out files`
   
   ```
   2021-02-26 10:16:24,847 WARN  org.apache.flink.runtime.taskmanager.Task                    [] - Source: Custom Source -> Sink: Unnamed (1/1)#1 (fef59671d3d0090a9195d35cbe68f2db) switched from RUNNING to FAILED with failure cause: org.apache.flink.kinesis.shaded.com.amazonaws.AbortedException:
   	at org.apache.flink.kinesis.shaded.com.amazonaws.internal.SdkFilterInputStream.abortIfNeeded(SdkFilterInputStream.java:61)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:89)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.event.ProgressInputStream.read(ProgressInputStream.java:180)
   	at com.fasterxml.jackson.core.json.ByteSourceJsonBootstrapper.ensureLoaded(ByteSourceJsonBootstrapper.java:539)
   	at com.fasterxml.jackson.core.json.ByteSourceJsonBootstrapper.detectEncoding(ByteSourceJsonBootstrapper.java:133)
   	at com.fasterxml.jackson.core.json.ByteSourceJsonBootstrapper.constructParser(ByteSourceJsonBootstrapper.java:256)
   	at com.fasterxml.jackson.core.JsonFactory._createParser(JsonFactory.java:1656)
   	at com.fasterxml.jackson.core.JsonFactory.createParser(JsonFactory.java:1085)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.JsonResponseHandler.handle(JsonResponseHandler.java:109)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.JsonResponseHandler.handle(JsonResponseHandler.java:43)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.response.AwsResponseHandlerAdapter.handle(AwsResponseHandlerAdapter.java:69)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleResponse(AmazonHttpClient.java:1714)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleSuccessResponse(AmazonHttpClient.java:1434)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1356)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1139)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:796)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:764)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:738)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:698)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:680)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:544)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:524)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.AmazonKinesisClient.doInvoke(AmazonKinesisClient.java:2809)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.AmazonKinesisClient.invoke(AmazonKinesisClient.java:2776)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.AmazonKinesisClient.invoke(AmazonKinesisClient.java:2765)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.AmazonKinesisClient.executeGetShardIterator(AmazonKinesisClient.java:1396)
   	at org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.AmazonKinesisClient.getShardIterator(AmazonKinesisClient.java:1367)
   	at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.getShardIterator(KinesisProxy.java:381)
   	at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.getShardIterator(KinesisProxy.java:371)
   	at org.apache.flink.streaming.connectors.kinesis.internals.publisher.polling.PollingRecordPublisher.getShardIterator(PollingRecordPublisher.java:186)
   	at org.apache.flink.streaming.connectors.kinesis.internals.publisher.polling.PollingRecordPublisher.<init>(PollingRecordPublisher.java:95)
   	at org.apache.flink.streaming.connectors.kinesis.internals.publisher.polling.PollingRecordPublisherFactory.create(PollingRecordPublisherFactory.java:86)
   	at org.apache.flink.streaming.connectors.kinesis.internals.publisher.polling.PollingRecordPublisherFactory.create(PollingRecordPublisherFactory.java:34)
   	at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.createRecordPublisher(KinesisDataFetcher.java:496)
   	at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.createShardConsumer(KinesisDataFetcher.java:465)
   	at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.runFetcher(KinesisDataFetcher.java:592)
   	at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.run(FlinkKinesisConsumer.java:392)
   	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
   	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:66)
   	at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:257)
   ```







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bd617892bfec1db1654606355041a6e4b9050304 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419) 
   * d6f06f07e0895117b345b99533ba7eda672ba765 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-796915630


   Failed; tweaking the test and retrying:
   - https://dev.azure.com/georgeryan1322/Flink/_build/results?buildId=352&view=results





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-791455142


   Also note that you need to include this commit if you want to make the secrets forwarding work: https://github.com/rmetzger/flink/commit/697c40cad14f42119604b3e754c4a52ede4a5c82
   





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-790872203


   > Thanks for your review @dannycranmer!
   > 
   > Before we can merge this PR, we need to make sure that CI is passing without credentials (this is the case in the PR CI validation), and with credentials (my personal CI).
   
   Hi Robert, I added an `if` condition so that the e2e test only runs when credentials are available. CI then kept failing with
   [[FAIL] 'Run kubernetes session test (default input)' failed after 1 minutes and 58 seconds! Test exited with exit code 1](https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=14148&view=logs&j=c88eea3b-64a0-564d-0031-9fdcd7b8abee&t=ff888d9b-cd34-53cc-d90f-3e446d355529&l=2236). Do you think merging the latest commit would fix the issue?





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793560044


   The `-z` approach should work: https://github.com/apache/flink/blob/master/flink-end-to-end-tests/test-scripts/common_s3.sh#L25
   but it's worth a try. 
   
   Can you also add 
   ```
   SECRET_S3_ACCESS_KEY: $[variables.IT_CASE_S3_ACCESS_KEY]
   SECRET_S3_SECRET_KEY: $[variables.IT_CASE_S3_SECRET_KEY]
   ```
   to `build-apache-repo.yml`?
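   For readers following the thread, the skip logic under discussion (the `-z "$VAR"` check in `common_s3.sh`) reduces to: run the e2e test only when both secrets are present in the environment. A minimal sketch of that predicate, with illustrative names that are not part of the PR:

   ```java
   /**
    * Hypothetical sketch of the credential gate discussed in this thread:
    * the e2e test should run only when both secrets were forwarded.
    */
   public final class CredentialGate {
       /** Equivalent of applying the shell test `[ -z "$KEY" ]` to both keys. */
       public static boolean shouldRunE2eTest(String accessKey, String secretKey) {
           return isSet(accessKey) && isSet(secretKey);
       }

       private static boolean isSet(String value) {
           return value != null && !value.isEmpty();
       }
   }
   ```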





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-794508193


   > It failed again: https://dev.azure.com/rmetzger/Flink/_build/results?buildId=8955&view=logs&j=9401bf33-03c4-5a24-83fe-e51d75db73ef&t=72901ab2-7cd0-57be-82b1-bca51de20fba
   
   It's still failing because the credentials can't be extracted. How did you succeed last time?





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944) 
   * f793772656ad942f463a23b0dd43f3522f147493 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-790375151


   Looks like the e2e test is failing. Can you write the e2e test in a way that it only executes if the credentials are available? (personal Azure accounts won't have credentials)
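   One way to do this, sketched below with assumed environment-variable names (modeled on the existing S3 e2e scripts, not taken from the PR itself), is to probe for the injected secrets at the top of the test script and skip when they are absent, so that forks without the secrets stay green:

   ```shell
   # Hypothetical skip guard for the Glue Schema Registry e2e test;
   # the variable names are assumptions, not the PR's actual ones.
   credentials_missing() {
       [[ -z "${IT_CASE_S3_ACCESS_KEY:-}" || -z "${IT_CASE_S3_SECRET_KEY:-}" ]]
   }

   if credentials_missing; then
       echo "AWS credentials not available; skipping e2e test."
       # a real test script would `exit 0` here so the build stays green
   else
       echo "AWS credentials found; running e2e test."
   fi
   ```

   Skipping with a zero exit code, rather than failing, mirrors the check the reviewer points to in `common_s3.sh`.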





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * cb96570901cbca2a6c9fdefc98c4154839194fc1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14335) 
   * bafd3a9a41461850057e9f2d50c3cea1c52aad7a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14340) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577483869



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,150 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test_${scala.binary.version}</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.9</httpclient.version>
+		<httpcore.version>4.4.11</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-kinesis-test_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro-glue-schema-registry</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>junit</groupId>
+			<artifactId>junit</artifactId>
+			<version>${junit.version}</version>
+			<scope>compile</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+			<type>test-jar</type>
+			<scope>compile</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.httpcomponents</groupId>
+			<artifactId>httpclient</artifactId>
+			<version>${httpclient.version}</version>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.httpcomponents</groupId>
+			<artifactId>httpcore</artifactId>
+			<version>${httpcore.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.httpcomponents</groupId>
+			<artifactId>httpcore</artifactId>
+			<version>${httpcore.version}</version>
+		</dependency>

Review comment:
       Fixed

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org.apache.flink.glue.schema.registry.test/GSRKinesisPubsubClient.java
##########
@@ -0,0 +1,185 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.GetRecordsResult;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.Record;
+import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
+import org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle;
+import org.apache.flink.streaming.connectors.kinesis.proxy.GetShardListResult;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyInterface;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.EnvironmentVariableCredentialsProvider;
+import com.amazonaws.client.builder.AwsClientBuilder;
+import com.amazonaws.services.kinesis.AmazonKinesis;
+import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
+import com.amazonaws.services.kinesis.model.PutRecordRequest;
+import com.amazonaws.services.kinesis.model.PutRecordResult;
+import com.amazonaws.services.kinesis.model.ResourceNotFoundException;
+import com.amazonaws.services.schemaregistry.common.AWSDeserializerInput;
+import com.amazonaws.services.schemaregistry.common.AWSSerializerInput;
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.serializers.avro.AWSAvroSerializer;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
+import org.apache.avro.generic.GenericRecord;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.nio.ByteBuffer;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.UUID;
+
+/**
+ * Simple client to publish and retrieve messages, using the AWS Kinesis SDK, Flink Kinesis
+ * Connectors and Glue Schema Registry classes.
+ */
+public class GSRKinesisPubsubClient {
+    private static final Logger LOG = LoggerFactory.getLogger(GSRKinesisPubsubClient.class);
+
+    private final AmazonKinesis kinesisClient;
+    private final Properties properties;
+
+    public GSRKinesisPubsubClient(Properties properties) {
+        this.kinesisClient = createClientWithCredentials(properties);
+        this.properties = properties;
+    }
+
+    public void createTopic(String stream, int shards, Properties props) throws Exception {

Review comment:
       Fixed







[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766989857


   It's the same one, but with the compilation error fixed.





[GitHub] [flink] LinyuYao1021 removed a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 removed a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766989857


   It's the same one, but with the compilation error fixed.





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 061978aeb474636d954f85b0408b00a21f2571e5 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577864934



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>

Review comment:
       The dependency chain is fixed in GSR package but it'll need some time to release. Once it's out, it should also fix the enforcer check issue.







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 399f06e14079a512d35814508b6f7598d7d175ba Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417) 
   * bd617892bfec1db1654606355041a6e4b9050304 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-794239401


   > Have you setup your CI with the password as well, and verified the change?
   
   Not yet, I haven't used Azure before. Could you quickly walk me through what to do to set up CI?





[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-790802979


   @flinkbot run azure





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-796665243


   I was concerned that this would happen to you. A new hire in our company is facing the same issue.
   It looks like Azure is having problems with cryptocurrency mining on their infrastructure, but they lack the proper tooling to detect it.
   





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r578008770



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryOutputStreamSerializerTest.java
##########
@@ -0,0 +1,132 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import com.amazonaws.services.schemaregistry.common.configs.GlueSchemaRegistryConfiguration;
+import com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistrySerializationFacade;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import org.apache.avro.Schema;
+import org.junit.Before;
+import org.junit.Test;
+import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.io.ByteArrayOutputStream;
+import java.io.File;
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.Map;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.instanceOf;
+
+/** Tests for {@link GlueSchemaRegistryOutputStreamSerializer}. */
+public class GlueSchemaRegistryOutputStreamSerializerTest {

Review comment:
       Updated







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * f793772656ad942f463a23b0dd43f3522f147493 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953) 
   * fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 97864e82641279ac227eda19b938ebce62262867 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14316) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r594371070



##########
File path: flink-end-to-end-tests/test-scripts/common.sh
##########
@@ -364,6 +364,7 @@ function check_logs_for_errors {
       | grep -v "HeapDumpOnOutOfMemoryError" \
       | grep -v "error_prone_annotations" \
       | grep -v "Error sending fetch request" \
+      | grep -v "WARN  akka.remote.ReliableDeliverySupervisor" \

Review comment:
       @LinyuYao1021 Do we still need this change?







[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r583546283



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoder.java
##########
@@ -0,0 +1,81 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import org.apache.avro.Schema;
+
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.Map;
+
+/**
+ * Schema coder that allows reading schema that is somehow embedded into serialized record. Used by
+ * {@link GlueSchemaRegistryAvroDeserializationSchema} and {@link
+ * GlueSchemaRegistryAvroSerializationSchema}.
+ */
+public class GlueSchemaRegistryAvroSchemaCoder implements SchemaCoder {
+    private GlueSchemaRegistryInputStreamDeserializer glueSchemaRegistryInputStreamDeserializer;
+    private GlueSchemaRegistryOutputStreamSerializer glueSchemaRegistryOutputStreamSerializer;
+
+    /**
+     * Constructor accepts transport name and configuration map for AWS Glue Schema Registry.
+     *
+     * @param transportName topic name or stream name etc.
+     * @param configs configurations for AWS Glue Schema Registry
+     */
+    public GlueSchemaRegistryAvroSchemaCoder(
+            final String transportName, final Map<String, Object> configs) {
+        glueSchemaRegistryInputStreamDeserializer =
+                new GlueSchemaRegistryInputStreamDeserializer(configs);
+        glueSchemaRegistryOutputStreamSerializer =
+                new GlueSchemaRegistryOutputStreamSerializer(transportName, configs);
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSchemaCoder(
+            final GlueSchemaRegistryInputStreamDeserializer
+                    glueSchemaRegistryInputStreamDeserializer) {
+        this.glueSchemaRegistryInputStreamDeserializer = glueSchemaRegistryInputStreamDeserializer;
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSchemaCoder(
+            final GlueSchemaRegistryOutputStreamSerializer
+                    glueSchemaRegistryOutputStreamSerializer) {
+        this.glueSchemaRegistryOutputStreamSerializer = glueSchemaRegistryOutputStreamSerializer;
+    }
+
+    @Override
+    public Schema readSchema(InputStream in) throws IOException {
+        return glueSchemaRegistryInputStreamDeserializer.getSchemaAndDeserializedStream(in);
+    }
+
+    @Override
+    public void writeSchema(Schema schema, OutputStream out) throws IOException {
+        byte[] data = ((ByteArrayOutputStream) out).toByteArray();

Review comment:
       @LinyuYao1021 this comment is still open







[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-796643342


   Current status: my new Azure account is blocked waiting for a limit increase to run parallel builds. Until this is complete, I cannot verify the e2e tests. I have sent an email to Azure as described in the docs and am waiting for a response.





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 97864e82641279ac227eda19b938ebce62262867 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14316) 
   * 5779d8d7942ed9d747a4f07043c3fad3d1ff82f0 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14331) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] rmetzger commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r587207722



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/User.java
##########
@@ -0,0 +1,434 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.avro.message.BinaryMessageDecoder;
+import org.apache.avro.message.BinaryMessageEncoder;
+import org.apache.avro.message.SchemaStore;
+import org.apache.avro.specific.SpecificData;
+
+@SuppressWarnings("all")
+@org.apache.avro.specific.AvroGenerated
+public class User extends org.apache.avro.specific.SpecificRecordBase

Review comment:
       But isn't this file generated from the schema defined in `flink-formats/flink-avro-glue-schema-registry/src/test/java/resources/avro/user.avsc`?
   The `avro-maven-plugin` can generate this `User.java` file based on `user.avsc`.
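
   For reference, a minimal `avro-maven-plugin` configuration that regenerates a class like `User.java` from a checked-in `.avsc` schema could look like the following sketch; the plugin version and directory paths are illustrative and would need to match this module's actual layout:

   ```xml
   <plugin>
     <groupId>org.apache.avro</groupId>
     <artifactId>avro-maven-plugin</artifactId>
     <!-- version is illustrative; align it with the Avro version used by flink-avro -->
     <version>1.10.0</version>
     <executions>
       <execution>
         <phase>generate-sources</phase>
         <goals>
           <goal>schema</goal>
         </goals>
         <configuration>
           <!-- paths are assumptions; point them at the module's .avsc files -->
           <sourceDirectory>${project.basedir}/src/test/resources/avro/</sourceDirectory>
           <outputDirectory>${project.build.directory}/generated-test-sources/avro/</outputDirectory>
         </configuration>
       </execution>
     </executions>
   </plugin>
   ```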







[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r594496214



##########
File path: flink-end-to-end-tests/test-scripts/common.sh
##########
@@ -364,6 +364,7 @@ function check_logs_for_errors {
       | grep -v "HeapDumpOnOutOfMemoryError" \
       | grep -v "error_prone_annotations" \
       | grep -v "Error sending fetch request" \
+      | grep -v "WARN  akka.remote.ReliableDeliverySupervisor" \

Review comment:
       This is to avoid failure under this scenario:
   
   ```
   2021-03-11T23:25:41.9106886Z Mar 11 23:25:41 2021-03-11 23:25:39,736 WARN  akka.remote.ReliableDeliverySupervisor                       [] - Association with remote system [akka.tcp://flink-metrics@10.1.0.4:37981] has failed, address is now gated for [50] ms. Reason: [Disassociated] 
   2021-03-11T23:25:41.9108202Z Mar 11 23:25:41 2021-03-11 23:25:39,747 WARN  akka.remote.ReliableDeliverySupervisor                       [] - Association with remote system [akka.tcp://flink@10.1.0.4:37839] has failed, address is now gated for [50] ms. Reason: [Disassociated] 
   2021-03-11T23:25:41.9109453Z Mar 11 23:25:41 2021-03-11 23:25:40,010 WARN  akka.remote.transport.netty.NettyTransport                   [] - Remote connection to [null] failed with java.net.ConnectException: Connection refused: /10.1.0.4:37839
   2021-03-11T23:25:41.9111511Z Mar 11 23:25:41 2021-03-11 23:25:40,010 WARN  akka.remote.ReliableDeliverySupervisor                       [] - Association with remote system [akka.tcp://flink@10.1.0.4:***@10.1.0.4:37839]] Caused by: [java.net.ConnectException: Connection refused: /10.1.0.4:37839]
   ```
   
   We already ignore `grep -v "WARN  akka.remote.transport.netty.NettyTransport"`
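
   To make the filtering concrete, here is a small, self-contained sketch of how the `grep -v` allow-list in `check_logs_for_errors` behaves; the sample log lines are made up, but the patterns match the ones in the diff:

   ```shell
   # Each `grep -v` drops lines matching a known-benign pattern, so only
   # unexpected errors survive the pipeline. Sample input is illustrative.
   printf '%s\n' \
     'ERROR genuine job failure' \
     'WARN  akka.remote.ReliableDeliverySupervisor [] - Association with remote system failed' \
     'WARN  akka.remote.transport.netty.NettyTransport [] - Remote connection to [null] failed' \
     | grep -v "WARN  akka.remote.ReliableDeliverySupervisor" \
     | grep -v "WARN  akka.remote.transport.netty.NettyTransport"
   # only the genuine ERROR line is printed
   ```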




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-794285369


   Here's how to set up Azure: https://cwiki.apache.org/confluence/display/FLINK/Azure+Pipelines#AzurePipelines-Tutorial:SettingupAzurePipelinesforaforkoftheFlinkrepository
   





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793572293


   > https://github.com/apache/flink/blob/master/flink-end-to-end-tests/test-scripts/common_s3.sh#L25
   
   Why does the s3 access key need to be added?
   
   > The `-z` approach should work: https://github.com/apache/flink/blob/master/flink-end-to-end-tests/test-scripts/common_s3.sh#L25
   > but it's worth a try.
   > 
   > Can you also add
   > 
   > ```
   > SECRET_S3_ACCESS_KEY: $[variables.IT_CASE_S3_ACCESS_KEY]
   > SECRET_S3_SECRET_KEY: $[variables.IT_CASE_S3_SECRET_KEY]
   > ```
   > 
   > to `build-apache-repo.yml`?
   
   Why is the s3 access key needed in my case?





[GitHub] [flink] jiamo commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
jiamo commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-803882944


   A small question: we know that with the repo https://github.com/awslabs/aws-glue-data-catalog-client-for-apache-hive-metastore, AWS EMR Hive can seamlessly talk to the Glue metastore.
   But when using Flink's Hive integration, it uses the original metastore client.
   Is it possible to add an option so that Flink's Hive integration can talk to the Glue metastore?
   





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793576430


   Sorry, I meant 
   ```
     SECRET_GLUE_SCHEMA_ACCESS_KEY: $[variables.IT_CASE_GLUE_SCHEMA_ACCESS_KEY]
     SECRET_GLUE_SCHEMA_SECRET_KEY: $[variables.IT_CASE_GLUE_SCHEMA_SECRET_KEY]
   ```





[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-795087810


   I have set up my pipeline and am running master to verify it works; then I will run your GSR branch:
   - https://dev.azure.com/dannycranmer/Flink/_build/results?buildId=4&view=results





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574341163



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       Where are you seeing `ClassNotFoundException`? At runtime the class would be provided by the Flink cluster. You may see an issue running in standalone mode. The problem with not making this `provided` is that, when building an uber jar for an app, it could bundle additional unnecessary Flink code into the jar. That would bloat the jar size and the classpath.







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395) 
   * 86353afe862e41b51dc48caf19f48fc03e6246b0 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577638986



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org/apache/flink/glue/schema/registry/test/GSRKinesisPubsubClient.java
##########
@@ -0,0 +1,182 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.GetRecordsResult;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.Record;
+import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
+import org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle;
+import org.apache.flink.streaming.connectors.kinesis.proxy.GetShardListResult;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyInterface;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.EnvironmentVariableCredentialsProvider;
+import com.amazonaws.client.builder.AwsClientBuilder;
+import com.amazonaws.services.kinesis.AmazonKinesis;
+import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
+import com.amazonaws.services.kinesis.model.PutRecordRequest;
+import com.amazonaws.services.kinesis.model.PutRecordResult;
+import com.amazonaws.services.kinesis.model.ResourceNotFoundException;
+import com.amazonaws.services.schemaregistry.common.AWSDeserializerInput;
+import com.amazonaws.services.schemaregistry.common.AWSSerializerInput;
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.serializers.avro.AWSAvroSerializer;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
+import org.apache.avro.generic.GenericRecord;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.nio.ByteBuffer;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.UUID;
+
+/**
+ * Simple client to publish and retrieve messages, using the AWS Kinesis SDK, Flink Kinesis
+ * Connectors and Glue Schema Registry classes.
+ */
+public class GSRKinesisPubsubClient {
+    private static final Logger LOG = LoggerFactory.getLogger(GSRKinesisPubsubClient.class);
+
+    private final AmazonKinesis kinesisClient;
+    private final Properties properties;
+
+    public GSRKinesisPubsubClient(Properties properties) {
+        this.kinesisClient = createClientWithCredentials(properties);
+        this.properties = properties;
+    }
+
+    public void createStream(String stream, int shards, Properties props) throws Exception {
+        try {
+            kinesisClient.describeStream(stream);
+            kinesisClient.deleteStream(stream);
+        } catch (ResourceNotFoundException rnfe) {
+        }
+
+        kinesisClient.createStream(stream, shards);
+        Deadline deadline = Deadline.fromNow(Duration.ofSeconds(5));
+        while (deadline.hasTimeLeft()) {
+            try {
+                Thread.sleep(250);
+                if (kinesisClient.describeStream(stream).getStreamDescription().getShards().size()
+                        != shards) {
+                    continue;
+                }
+                break;
+            } catch (ResourceNotFoundException rnfe) {

Review comment:
       nit: Add a comment to say this is expected/ok. At a glance it looks like it is missing error handling
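   A minimal sketch of the suggested fix (all names here are illustrative stand-ins; the real code polls `describeStream` against the AWS SDK's `ResourceNotFoundException`): the empty catch block gets a comment making clear the exception is expected, so the loop no longer looks like it is missing error handling.
   
   ```java
   import java.util.Set;
   
   public class RetrySketch {
       // Hypothetical stand-in for the AWS SDK's ResourceNotFoundException so the
       // sketch compiles without the SDK on the classpath.
       static class ResourceNotFoundException extends RuntimeException {}
   
       // Stand-in for kinesisClient.describeStream(stream).
       static void describeStream(Set<String> existing, String stream) {
           if (!existing.contains(stream)) {
               throw new ResourceNotFoundException();
           }
       }
   
       static boolean waitForStream(Set<String> existing, String stream, int attempts) {
           for (int i = 0; i < attempts; i++) {
               try {
                   describeStream(existing, stream);
                   return true;
               } catch (ResourceNotFoundException rnfe) {
                   // Expected: the stream is not (yet) visible; swallow and retry
                   // until the deadline expires. This comment is the change the
                   // review asks for.
               }
           }
           return false;
       }
   }
   ```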

##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderProvider.java
##########
@@ -0,0 +1,53 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import lombok.NonNull;

Review comment:
       Yes, I meant that you are using `lombok` rather than `javax.annotation`. But I think this is not needed, since the Flink coding standards say everything is `nonnull` by default:
    - https://flink.apache.org/contributing/code-style-and-quality-common.html#nullability-of-the-mutable-parts
    
    This also raises the question of why `transportName` is not `@NonNull`. Is it `@Nullable`? 
   
   Please remove the annotations unless there is a good reason to keep them. 
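   A sketch of what the suggestion amounts to (class and field names are illustrative, not the actual Flink code): drop lombok's `@NonNull` and rely on an explicit runtime check — `java.util.Objects.requireNonNull` here stands in for Flink's `Preconditions.checkNotNull` — while a genuinely optional parameter would instead be annotated `@Nullable`.
   
   ```java
   import java.util.Map;
   import java.util.Objects;
   
   public class CoderProviderSketch {
       private final String transportName;
       private final Map<String, Object> configs;
   
       public CoderProviderSketch(String transportName, Map<String, Object> configs) {
           // Flink's style treats parameters as non-null by default, so instead of
           // lombok's @NonNull an explicit check documents and enforces the contract.
           this.configs = Objects.requireNonNull(configs, "configs must not be null");
           // Illustrative only: if transportName were truly optional, it would be
           // annotated @Nullable rather than silently left unchecked.
           this.transportName = transportName;
       }
   
       public Map<String, Object> getConfigs() {
           return configs;
       }
   }
   ```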

##########
File path: flink-end-to-end-tests/test-scripts/test_glue_schema_registry.sh
##########
@@ -0,0 +1,72 @@
+#!/usr/bin/env bash

Review comment:
       https://issues.apache.org/jira/browse/FLINK-21391

##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>

Review comment:
       This comment looks misplaced. It is not related to the enforcer skip; it is about the transitive dependency chain. Did you reply to the wrong thread?

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,252 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.3</httpclient.version>
+		<httpcore.version>4.4.6</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+	</properties>
+
+	<!-- ============================= -->
+	<!-- DEPENDENCY MANAGEMENT -->
+	<!-- ============================= -->
+	<dependencyManagement>

Review comment:
       Why is there a dependency management block here? Unless I am mistaken, `dependencyManagement` is used to set up dependencies for child modules that include this pom as their parent. How is it being used in this context?

##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       What is fixed? I do not see any changes?

##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoder.java
##########
@@ -0,0 +1,81 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import org.apache.avro.Schema;
+
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.Map;
+
+/**
+ * Schema coder that allows reading schema that is somehow embedded into serialized record. Used by
+ * {@link GlueSchemaRegistryAvroDeserializationSchema} and {@link
+ * GlueSchemaRegistryAvroSerializationSchema}.
+ */
+public class GlueSchemaRegistryAvroSchemaCoder implements SchemaCoder {
+    private GlueSchemaRegistryInputStreamDeserializer glueSchemaRegistryInputStreamDeserializer;
+    private GlueSchemaRegistryOutputStreamSerializer glueSchemaRegistryOutputStreamSerializer;
+
+    /**
+     * Constructor accepts transport name and configuration map for AWS Glue Schema Registry.
+     *
+     * @param transportName topic name or stream name etc.
+     * @param configs configurations for AWS Glue Schema Registry
+     */
+    public GlueSchemaRegistryAvroSchemaCoder(
+            final String transportName, final Map<String, Object> configs) {
+        glueSchemaRegistryInputStreamDeserializer =
+                new GlueSchemaRegistryInputStreamDeserializer(configs);
+        glueSchemaRegistryOutputStreamSerializer =
+                new GlueSchemaRegistryOutputStreamSerializer(transportName, configs);
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSchemaCoder(
+            final GlueSchemaRegistryInputStreamDeserializer
+                    glueSchemaRegistryInputStreamDeserializer) {
+        this.glueSchemaRegistryInputStreamDeserializer = glueSchemaRegistryInputStreamDeserializer;
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSchemaCoder(
+            final GlueSchemaRegistryOutputStreamSerializer
+                    glueSchemaRegistryOutputStreamSerializer) {
+        this.glueSchemaRegistryOutputStreamSerializer = glueSchemaRegistryOutputStreamSerializer;
+    }
+
+    @Override
+    public Schema readSchema(InputStream in) throws IOException {
+        return glueSchemaRegistryInputStreamDeserializer.getSchemaAndDeserializedStream(in);
+    }
+
+    @Override
+    public void writeSchema(Schema schema, OutputStream out) throws IOException {
+        byte[] data = ((ByteArrayOutputStream) out).toByteArray();

Review comment:
       ok, please add the `Preconditions` check
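   A sketch of the requested guard (the helper name is hypothetical; in the real code the cast lives in `writeSchema`): fail fast with a clear message when the `OutputStream` is not the expected `ByteArrayOutputStream`, instead of an unexplained `ClassCastException`. A plain `IllegalArgumentException` stands in here for Flink's `Preconditions.checkArgument`.
   
   ```java
   import java.io.ByteArrayOutputStream;
   import java.io.OutputStream;
   
   public class SchemaCoderGuardSketch {
       static byte[] toByteArray(OutputStream out) {
           // Equivalent of Preconditions.checkArgument(out instanceof ByteArrayOutputStream, ...):
           // validate the assumption before the unchecked cast below.
           if (!(out instanceof ByteArrayOutputStream)) {
               throw new IllegalArgumentException(
                       "Expected a ByteArrayOutputStream but got: " + out.getClass().getName());
           }
           return ((ByteArrayOutputStream) out).toByteArray();
       }
   }
   ```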

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org/apache/flink/glue/schema/registry/test/GlueSchemaRegistryExample.java
##########
@@ -0,0 +1,110 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.java.utils.ParameterTool;
+import org.apache.flink.formats.avro.glue.schema.registry.GlueSchemaRegistryAvroDeserializationSchema;
+import org.apache.flink.formats.avro.glue.schema.registry.GlueSchemaRegistryAvroSerializationSchema;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
+import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisProducer;
+import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
+import org.apache.flink.streaming.kafka.test.base.KafkaExampleUtil;
+
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
+import org.apache.avro.Schema;
+import org.apache.avro.generic.GenericRecord;
+
+import java.io.File;
+import java.io.IOException;
+import java.net.URL;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Properties;
+
+/**
+ * A simple example that shows how to read from and write to Kinesis. This will read Avro messages
+ * from the input topic, and finally write back to another topic.

Review comment:
       nit: Kinesis streams; topics are a Kafka thing

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org/apache/flink/glue/schema/registry/test/GSRKinesisPubsubClient.java
##########
@@ -0,0 +1,182 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.GetRecordsResult;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.Record;
+import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
+import org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle;
+import org.apache.flink.streaming.connectors.kinesis.proxy.GetShardListResult;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyInterface;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.EnvironmentVariableCredentialsProvider;
+import com.amazonaws.client.builder.AwsClientBuilder;
+import com.amazonaws.services.kinesis.AmazonKinesis;
+import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
+import com.amazonaws.services.kinesis.model.PutRecordRequest;
+import com.amazonaws.services.kinesis.model.PutRecordResult;
+import com.amazonaws.services.kinesis.model.ResourceNotFoundException;
+import com.amazonaws.services.schemaregistry.common.AWSDeserializerInput;
+import com.amazonaws.services.schemaregistry.common.AWSSerializerInput;
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.serializers.avro.AWSAvroSerializer;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
+import org.apache.avro.generic.GenericRecord;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.nio.ByteBuffer;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.UUID;
+
+/**
+ * Simple client to publish and retrieve messages, using the AWS Kinesis SDK, Flink Kinesis
+ * Connectors and Glue Schema Registry classes.
+ */
+public class GSRKinesisPubsubClient {
+    private static final Logger LOG = LoggerFactory.getLogger(GSRKinesisPubsubClient.class);
+
+    private final AmazonKinesis kinesisClient;
+    private final Properties properties;
+
+    public GSRKinesisPubsubClient(Properties properties) {
+        this.kinesisClient = createClientWithCredentials(properties);
+        this.properties = properties;
+    }
+
+    public void createStream(String stream, int shards, Properties props) throws Exception {
+        try {
+            kinesisClient.describeStream(stream);
+            kinesisClient.deleteStream(stream);
+        } catch (ResourceNotFoundException rnfe) {

Review comment:
       nit: Add a comment to say this is expected/OK. At a glance it looks like error handling is missing.
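
   The delete-if-exists idiom this comment refers to can be sketched as follows. The exception and client below are self-contained stand-ins, not the real AWS SDK types (the actual code catches `com.amazonaws.services.kinesis.model.ResourceNotFoundException`):

   ```java
   import java.util.HashSet;
   import java.util.Set;

   public class RecreateStreamSketch {
       // Stand-in for the AWS SDK's ResourceNotFoundException.
       public static class ResourceNotFoundException extends RuntimeException {}

       // Minimal fake "Kinesis" that only tracks stream names.
       public static class FakeKinesis {
           private final Set<String> streams = new HashSet<>();

           public void describeStream(String stream) {
               if (!streams.contains(stream)) {
                   throw new ResourceNotFoundException();
               }
           }

           public void deleteStream(String stream) {
               streams.remove(stream);
           }

           public void createStream(String stream, int shards) {
               streams.add(stream);
           }

           public boolean exists(String stream) {
               return streams.contains(stream);
           }
       }

       public static void recreateStream(FakeKinesis client, String stream, int shards) {
           try {
               client.describeStream(stream);
               client.deleteStream(stream);
           } catch (ResourceNotFoundException rnfe) {
               // Expected when the stream does not exist yet: there is nothing to delete,
               // so the exception is deliberately swallowed.
           }
           client.createStream(stream, shards);
       }
   }
   ```

   With the comment inside the empty catch block, a reader can see at a glance that the exception is part of the happy path rather than unhandled.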

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,252 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.3</httpclient.version>
+		<httpcore.version>4.4.6</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+	</properties>
+
+	<!-- ============================= -->
+	<!-- DEPENDENCY MANAGEMENT -->
+	<!-- ============================= -->
+	<dependencyManagement>
+		<dependencies>
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>http-client-spi</artifactId>
+				<version>2.15.32</version>

Review comment:
       Update the versions to use the property rather than hard-coding them.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r566133368



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,150 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test_${scala.binary.version}</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.9</httpclient.version>
+		<httpcore.version>4.4.11</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>

Review comment:
       nit: Consider updating, at the time of writing:
   - v1 @ 1.11.943
   - v2 @ 2.15.70

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,150 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test_${scala.binary.version}</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.9</httpclient.version>
+		<httpcore.version>4.4.11</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-kinesis-test_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro-glue-schema-registry</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>junit</groupId>
+			<artifactId>junit</artifactId>
+			<version>${junit.version}</version>
+			<scope>compile</scope>

Review comment:
       nit: you should not need the `<version>` and `<scope>` tags here. I assume you meant to include junit as a compile-scoped dependency (compile is the default).

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,150 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test_${scala.binary.version}</artifactId>

Review comment:
       Correct me if I am wrong, but I do not think you need additional artifacts per Scala version. Suggest dropping `_${scala.binary.version}`

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org.apache.flink.glue.schema.registry.test/GSRKinesisPubsubClient.java
##########
@@ -0,0 +1,185 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.GetRecordsResult;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.Record;
+import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
+import org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle;
+import org.apache.flink.streaming.connectors.kinesis.proxy.GetShardListResult;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyInterface;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.EnvironmentVariableCredentialsProvider;
+import com.amazonaws.client.builder.AwsClientBuilder;
+import com.amazonaws.services.kinesis.AmazonKinesis;
+import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
+import com.amazonaws.services.kinesis.model.PutRecordRequest;
+import com.amazonaws.services.kinesis.model.PutRecordResult;
+import com.amazonaws.services.kinesis.model.ResourceNotFoundException;
+import com.amazonaws.services.schemaregistry.common.AWSDeserializerInput;
+import com.amazonaws.services.schemaregistry.common.AWSSerializerInput;
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.serializers.avro.AWSAvroSerializer;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
+import org.apache.avro.generic.GenericRecord;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.nio.ByteBuffer;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.UUID;
+
+/**
+ * Simple client to publish and retrieve messages, using the AWS Kinesis SDK, Flink Kinesis
+ * Connectors and Glue Schema Registry classes.
+ */
+public class GSRKinesisPubsubClient {
+    private static final Logger LOG = LoggerFactory.getLogger(GSRKinesisPubsubClient.class);
+
+    private final AmazonKinesis kinesisClient;
+    private final Properties properties;
+
+    public GSRKinesisPubsubClient(Properties properties) {
+        this.kinesisClient = createClientWithCredentials(properties);
+        this.properties = properties;
+    }
+
+    public void createTopic(String stream, int shards, Properties props) throws Exception {
+        try {
+            kinesisClient.describeStream(stream);
+            kinesisClient.deleteStream(stream);
+        } catch (ResourceNotFoundException rnfe) {
+            // expected when stream doesn't exist
+        }
+
+        kinesisClient.createStream(stream, shards);
+        Deadline deadline = Deadline.fromNow(Duration.ofSeconds(5));
+        while (deadline.hasTimeLeft()) {
+            try {
+                Thread.sleep(250); // sleep for a bit for stream to be created
+                if (kinesisClient.describeStream(stream).getStreamDescription().getShards().size()
+                        != shards) {
+                    // not fully created yet

Review comment:
       Consider removing these comments as per the "Golden Rule":
   - Golden rule: Comment as much as necessary to support code understanding, but don’t add redundant information.
   
   https://flink.apache.org/contributing/code-style-and-quality-common.html#comments-and-code-readability
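   
   For reference, the deadline-based polling pattern in the quoted test code can be sketched with plain JDK time primitives; this is a simplified stand-in for the loop built on Flink's `Deadline`, not the code under review itself:
   
   ```java
   import java.time.Duration;
   import java.time.Instant;
   import java.util.function.BooleanSupplier;
   
   public class DeadlinePollSketch {
       /**
        * Polls the condition at a fixed interval until it holds or the timeout
        * elapses. Returns whether the condition eventually held.
        */
       public static boolean pollUntil(
               BooleanSupplier condition, Duration timeout, Duration interval)
               throws InterruptedException {
           Instant deadline = Instant.now().plus(timeout);
           while (Instant.now().isBefore(deadline)) {
               if (condition.getAsBoolean()) {
                   return true;
               }
               Thread.sleep(interval.toMillis());
           }
           // One final check in case the deadline passed between iterations.
           return condition.getAsBoolean();
       }
   }
   ```
   
   Extracting the loop into a well-named helper like this also removes the need for the inline comments the review points out.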
   

##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSerializationSchema.java
##########
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.formats.avro.RegistryAvroSerializationSchema;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import lombok.SneakyThrows;
+import org.apache.avro.Schema;
+import org.apache.avro.generic.GenericRecord;
+import org.apache.avro.io.Encoder;
+import org.apache.avro.specific.SpecificRecord;
+
+import javax.annotation.Nullable;
+
+import java.io.ByteArrayOutputStream;
+import java.util.Map;
+
+/**
+ * AWS Glue Schema Registry Serialization schema to serialize to Avro binary format for Flink
+ * Producer user.
+ *
+ * @param <T> the type to be serialized
+ */
+public class GlueSchemaRegistryAvroSerializationSchema<T>
+        extends RegistryAvroSerializationSchema<T> {
+    /**
+     * Creates an Avro serialization schema.
+     *
+     * @param recordClazz class to serialize. Should be one of: {@link SpecificRecord}, {@link
+     *     GenericRecord}.
+     * @param reader reader's Avro schema. Should be provided if recordClazz is {@link
+     *     GenericRecord}
+     * @param schemaCoderProvider schema coder provider which reads writer schema from AWS Glue
+     *     Schema Registry
+     */
+    private GlueSchemaRegistryAvroSerializationSchema(
+            Class<T> recordClazz,
+            @Nullable Schema reader,
+            SchemaCoder.SchemaCoderProvider schemaCoderProvider) {
+        super(recordClazz, reader, schemaCoderProvider);
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSerializationSchema(
+            Class<T> recordClazz, @Nullable Schema reader, SchemaCoder schemaCoder) {
+        // Pass null schema coder provider
+        super(recordClazz, reader, null);
+        this.schemaCoder = schemaCoder;
+    }
+
+    /**
+     * Creates {@link GlueSchemaRegistryAvroSerializationSchema} that serializes {@link
+     * GenericRecord} using provided schema.
+     *
+     * @param schema the schema that will be used for serialization
+     * @param transportName topic name or stream name etc.
+     * @param configs configuration map of AWS Glue Schema Registry
+     * @return serialized record in form of byte array
+     */
+    public static GlueSchemaRegistryAvroSerializationSchema<GenericRecord> forGeneric(
+            Schema schema, String transportName, Map<String, Object> configs) {
+        return new GlueSchemaRegistryAvroSerializationSchema<>(
+                GenericRecord.class,
+                schema,
+                new GlueSchemaRegistryAvroSchemaCoderProvider(transportName, configs));
+    }
+
+    /**
+     * Creates {@link GlueSchemaRegistryAvroSerializationSchema} that serializes {@link
+     * SpecificRecord} using provided schema.
+     *
+     * @param clazz the type to be serialized
+     * @param transportName topic name or stream name etc.
+     * @param configs configuration map of Amazon Schema Registry
+     * @return serialized record in form of byte array
+     */
+    public static <T extends SpecificRecord>
+            GlueSchemaRegistryAvroSerializationSchema<T> forSpecific(
+                    Class<T> clazz, String transportName, Map<String, Object> configs) {
+        return new GlueSchemaRegistryAvroSerializationSchema<>(
+                clazz, null, new GlueSchemaRegistryAvroSchemaCoderProvider(transportName, configs));
+    }
+
+    /**
+     * Serializes the incoming element to a byte array containing bytes of AWS Glue Schema registry
+     * information.
+     *
+     * @param object The incoming element to be serialized
+     * @return The serialized bytes.
+     */
+    @SneakyThrows
+    @Override
+    public byte[] serialize(T object) {
+        checkAvroInitialized();
+
+        if (object == null) {
+            return null;
+        } else {
+            ByteArrayOutputStream outputStream = getOutputStream();
+            outputStream.reset();
+            Encoder encoder = getEncoder();
+            getDatumWriter().write(object, encoder);
+            schemaCoder.writeSchema(getSchema(), outputStream);
+            encoder.flush();
+
+            return outputStream.toByteArray();
+        }
+    }

Review comment:
       Is there a reason to override this? It looks almost identical to the version in `RegistryAvroSerializationSchema`

##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-clients_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       Is this actually required? The module still builds without it; this dependency is usually provided by the user application when needed.

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org.apache.flink.glue.schema.registry.test/GSRKinesisPubsubClient.java
##########
@@ -0,0 +1,185 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.GetRecordsResult;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.Record;
+import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
+import org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle;
+import org.apache.flink.streaming.connectors.kinesis.proxy.GetShardListResult;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyInterface;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.EnvironmentVariableCredentialsProvider;
+import com.amazonaws.client.builder.AwsClientBuilder;
+import com.amazonaws.services.kinesis.AmazonKinesis;
+import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
+import com.amazonaws.services.kinesis.model.PutRecordRequest;
+import com.amazonaws.services.kinesis.model.PutRecordResult;
+import com.amazonaws.services.kinesis.model.ResourceNotFoundException;
+import com.amazonaws.services.schemaregistry.common.AWSDeserializerInput;
+import com.amazonaws.services.schemaregistry.common.AWSSerializerInput;
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.serializers.avro.AWSAvroSerializer;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
+import org.apache.avro.generic.GenericRecord;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.nio.ByteBuffer;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.UUID;
+
+/**
+ * Simple client to publish and retrieve messages, using the AWS Kinesis SDK, Flink Kinesis
+ * Connectors and Glue Schema Registry classes.
+ */
+public class GSRKinesisPubsubClient {
+    private static final Logger LOG = LoggerFactory.getLogger(GSRKinesisPubsubClient.class);
+
+    private final AmazonKinesis kinesisClient;
+    private final Properties properties;
+
+    public GSRKinesisPubsubClient(Properties properties) {
+        this.kinesisClient = createClientWithCredentials(properties);
+        this.properties = properties;
+    }
+
+    public void createTopic(String stream, int shards, Properties props) throws Exception {
+        try {
+            kinesisClient.describeStream(stream);

Review comment:
       Why do you need to describeStream here?

##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoder.java
##########
@@ -0,0 +1,81 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import org.apache.avro.Schema;
+
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.Map;
+
+/**
+ * Schema coder that allows reading schema that is somehow embedded into serialized record. Used by
+ * {@link GlueSchemaRegistryAvroDeserializationSchema} and {@link
+ * GlueSchemaRegistryAvroSerializationSchema}.
+ */
+public class GlueSchemaRegistryAvroSchemaCoder implements SchemaCoder {
+    private GlueSchemaRegistryInputStreamDeserializer glueSchemaRegistryInputStreamDeserializer;
+    private GlueSchemaRegistryOutputStreamSerializer glueSchemaRegistryOutputStreamSerializer;
+
+    /**
+     * Constructor accepts transport name and configuration map for AWS Glue Schema Registry.
+     *
+     * @param transportName topic name or stream name etc.
+     * @param configs configurations for AWS Glue Schema Registry
+     */
+    public GlueSchemaRegistryAvroSchemaCoder(
+            final String transportName, final Map<String, Object> configs) {
+        glueSchemaRegistryInputStreamDeserializer =
+                new GlueSchemaRegistryInputStreamDeserializer(configs);
+        glueSchemaRegistryOutputStreamSerializer =
+                new GlueSchemaRegistryOutputStreamSerializer(transportName, configs);
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSchemaCoder(
+            final GlueSchemaRegistryInputStreamDeserializer
+                    glueSchemaRegistryInputStreamDeserializer) {
+        this.glueSchemaRegistryInputStreamDeserializer = glueSchemaRegistryInputStreamDeserializer;
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSchemaCoder(
+            final GlueSchemaRegistryOutputStreamSerializer
+                    glueSchemaRegistryOutputStreamSerializer) {
+        this.glueSchemaRegistryOutputStreamSerializer = glueSchemaRegistryOutputStreamSerializer;
+    }
+
+    @Override
+    public Schema readSchema(InputStream in) throws IOException {
+        return glueSchemaRegistryInputStreamDeserializer.getSchemaAndDeserializedStream(in);
+    }
+
+    @Override
+    public void writeSchema(Schema schema, OutputStream out) throws IOException {
+        byte[] data = ((ByteArrayOutputStream) out).toByteArray();

Review comment:
       Are you sure this will always be a `ByteArrayOutputStream`? Since this is a public method, I suggest adding a check:
   - `Preconditions.checkArgument(out instanceof ByteArrayOutputStream, "..");`
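A minimal sketch of the suggested guard. The `checkArgument` helper below is a stand-in for Flink's `org.apache.flink.util.Preconditions`, and the class/method names are illustrative, not the actual PR code:

```java
import java.io.ByteArrayOutputStream;
import java.io.OutputStream;

public class WriteSchemaGuard {
    // Stand-in for org.apache.flink.util.Preconditions.checkArgument
    static void checkArgument(boolean condition, Object message) {
        if (!condition) {
            throw new IllegalArgumentException(String.valueOf(message));
        }
    }

    // Fail fast with a clear message instead of an unchecked ClassCastException
    static byte[] toByteArray(OutputStream out) {
        checkArgument(
                out instanceof ByteArrayOutputStream,
                "Expected a ByteArrayOutputStream, got: " + out.getClass().getName());
        return ((ByteArrayOutputStream) out).toByteArray();
    }
}
```

The benefit is that a caller passing any other `OutputStream` gets an `IllegalArgumentException` with a descriptive message rather than a bare `ClassCastException`.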

##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-clients_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>software.amazon.glue</groupId>
+			<artifactId>schema-registry-serde</artifactId>
+			<version>${glue.schema.registry.version}</version>
+		</dependency>
+
+		<!-- test dependencies -->
+
+		<dependency>
+			<groupId>org.junit.jupiter</groupId>
+			<artifactId>junit-jupiter-api</artifactId>
+			<version>${junit.jupiter.version}</version>
+			<scope>test</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.junit.jupiter</groupId>
+			<artifactId>junit-jupiter-params</artifactId>
+			<version>${junit.jupiter.version}</version>
+			<scope>test</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.mockito</groupId>
+			<artifactId>mockito-junit-jupiter</artifactId>
+			<version>${mockito.version}</version>
+			<scope>test</scope>
+		</dependency>

Review comment:
       Why are we not using the `junit` version provided by Flink? Is there a reason to upgrade a single module? Are you sure results and reports would aggregate correctly on CI? I am worried we might miss or break something by using a different test runner for this module.

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org.apache.flink.glue.schema.registry.test/GSRKinesisPubsubClient.java
##########
@@ -0,0 +1,185 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.GetRecordsResult;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.Record;
+import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
+import org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle;
+import org.apache.flink.streaming.connectors.kinesis.proxy.GetShardListResult;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyInterface;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.EnvironmentVariableCredentialsProvider;
+import com.amazonaws.client.builder.AwsClientBuilder;
+import com.amazonaws.services.kinesis.AmazonKinesis;
+import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
+import com.amazonaws.services.kinesis.model.PutRecordRequest;
+import com.amazonaws.services.kinesis.model.PutRecordResult;
+import com.amazonaws.services.kinesis.model.ResourceNotFoundException;
+import com.amazonaws.services.schemaregistry.common.AWSDeserializerInput;
+import com.amazonaws.services.schemaregistry.common.AWSSerializerInput;
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.serializers.avro.AWSAvroSerializer;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
+import org.apache.avro.generic.GenericRecord;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.nio.ByteBuffer;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.UUID;
+
+/**
+ * Simple client to publish and retrieve messages, using the AWS Kinesis SDK, Flink Kinesis
+ * Connectors and Glue Schema Registry classes.
+ */
+public class GSRKinesisPubsubClient {
+    private static final Logger LOG = LoggerFactory.getLogger(GSRKinesisPubsubClient.class);
+
+    private final AmazonKinesis kinesisClient;
+    private final Properties properties;
+
+    public GSRKinesisPubsubClient(Properties properties) {
+        this.kinesisClient = createClientWithCredentials(properties);
+        this.properties = properties;
+    }
+
+    public void createTopic(String stream, int shards, Properties props) throws Exception {

Review comment:
       `createStream`? I would consider updating the method name or adding Javadoc to indicate that an existing stream will be deleted.
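For illustration, a sketch of the suggested rename plus Javadoc. The in-memory "client" below stands in for the real `AmazonKinesis` client; it is not the actual test code:

```java
import java.util.HashSet;
import java.util.Set;

public class StreamAdmin {
    private final Set<String> streams = new HashSet<>();

    /**
     * Creates a Kinesis stream with the given name and shard count.
     *
     * <p>Note: if a stream with the same name already exists, it is deleted
     * and re-created, so callers must not rely on existing data surviving.
     */
    public void createStream(String streamName, int shards) {
        streams.remove(streamName); // delete any pre-existing stream first
        streams.add(streamName);
    }

    public boolean streamExists(String streamName) {
        return streams.contains(streamName);
    }
}
```

The Javadoc makes the destructive delete-and-recreate behavior explicit at the call site, which is the point of the review comment.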

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,150 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test_${scala.binary.version}</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.9</httpclient.version>
+		<httpcore.version>4.4.11</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-kinesis-test_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro-glue-schema-registry</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>junit</groupId>
+			<artifactId>junit</artifactId>
+			<version>${junit.version}</version>
+			<scope>compile</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+			<type>test-jar</type>
+			<scope>compile</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.httpcomponents</groupId>
+			<artifactId>httpclient</artifactId>
+			<version>${httpclient.version}</version>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.httpcomponents</groupId>
+			<artifactId>httpcore</artifactId>
+			<version>${httpcore.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.httpcomponents</groupId>
+			<artifactId>httpcore</artifactId>
+			<version>${httpcore.version}</version>
+		</dependency>

Review comment:
       Duplicate dependency

##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSerializationSchema.java
##########
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.formats.avro.RegistryAvroSerializationSchema;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import lombok.SneakyThrows;
+import org.apache.avro.Schema;
+import org.apache.avro.generic.GenericRecord;
+import org.apache.avro.io.Encoder;
+import org.apache.avro.specific.SpecificRecord;
+
+import javax.annotation.Nullable;
+
+import java.io.ByteArrayOutputStream;
+import java.util.Map;
+
+/**
+ * AWS Glue Schema Registry Serialization schema to serialize to Avro binary format for Flink
+ * Producer user.
+ *
+ * @param <T> the type to be serialized
+ */
+public class GlueSchemaRegistryAvroSerializationSchema<T>
+        extends RegistryAvroSerializationSchema<T> {
+    /**
+     * Creates an Avro serialization schema.
+     *
+     * @param recordClazz class to serialize. Should be one of: {@link SpecificRecord}, {@link
+     *     GenericRecord}.
+     * @param reader reader's Avro schema. Should be provided if recordClazz is {@link
+     *     GenericRecord}
+     * @param schemaCoderProvider schema coder provider which reads writer schema from AWS Glue
+     *     Schema Registry
+     */
+    private GlueSchemaRegistryAvroSerializationSchema(
+            Class<T> recordClazz,
+            @Nullable Schema reader,
+            SchemaCoder.SchemaCoderProvider schemaCoderProvider) {
+        super(recordClazz, reader, schemaCoderProvider);
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSerializationSchema(
+            Class<T> recordClazz, @Nullable Schema reader, SchemaCoder schemaCoder) {
+        // Pass null schema coder provider
+        super(recordClazz, reader, null);
+        this.schemaCoder = schemaCoder;
+    }
+
+    /**
+     * Creates {@link GlueSchemaRegistryAvroSerializationSchema} that serializes {@link
+     * GenericRecord} using provided schema.
+     *
+     * @param schema the schema that will be used for serialization
+     * @param transportName topic name or stream name etc.
+     * @param configs configuration map of AWS Glue Schema Registry
+     * @return serialized record in form of byte array
+     */
+    public static GlueSchemaRegistryAvroSerializationSchema<GenericRecord> forGeneric(
+            Schema schema, String transportName, Map<String, Object> configs) {
+        return new GlueSchemaRegistryAvroSerializationSchema<>(
+                GenericRecord.class,
+                schema,
+                new GlueSchemaRegistryAvroSchemaCoderProvider(transportName, configs));
+    }
+
+    /**
+     * Creates {@link GlueSchemaRegistryAvroSerializationSchema} that serializes {@link
+     * SpecificRecord} using provided schema.
+     *
+     * @param clazz the type to be serialized
+     * @param transportName topic name or stream name etc.
+     * @param configs configuration map of Amazon Schema Registry
+     * @return serialized record in form of byte array
+     */
+    public static <T extends SpecificRecord>
+            GlueSchemaRegistryAvroSerializationSchema<T> forSpecific(
+                    Class<T> clazz, String transportName, Map<String, Object> configs) {
+        return new GlueSchemaRegistryAvroSerializationSchema<>(
+                clazz, null, new GlueSchemaRegistryAvroSchemaCoderProvider(transportName, configs));
+    }
+
+    /**
+     * Serializes the incoming element to a byte array containing bytes of AWS Glue Schema registry
+     * information.
+     *
+     * @param object The incoming element to be serialized
+     * @return The serialized bytes.
+     */
+    @SneakyThrows

Review comment:
       See the parent implementation. Wrap or declare the exception rather than using `@SneakyThrows`.
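A sketch of the wrap-and-rethrow alternative to `@SneakyThrows`. The `serializeToBytes` helper and its body are illustrative stand-ins, not the actual Flink serialization logic:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;

public class WrapInsteadOfSneaky {
    // Illustrative stand-in for the real serialization logic
    static byte[] serializeToBytes(String record) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(record.getBytes(StandardCharsets.UTF_8));
        return out.toByteArray();
    }

    // Wrap the checked exception explicitly instead of hiding it with @SneakyThrows,
    // so the failure mode is visible in the method signature and stack trace.
    static byte[] serialize(String record) {
        try {
            return serializeToBytes(record);
        } catch (IOException e) {
            throw new UncheckedIOException("Failed to serialize record", e);
        }
    }
}
```

Wrapping preserves the original cause while keeping the overriding method's signature compatible with the parent, which `@SneakyThrows` achieves only by bypassing the compiler's checked-exception verification.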

##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderProvider.java
##########
@@ -0,0 +1,53 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import lombok.NonNull;

Review comment:
       I think this is the wrong annotation.
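For context, `lombok.NonNull` on a parameter generates a runtime null check at the top of the method, whereas `javax.annotation.Nullable`/`Nonnull` (used elsewhere in this PR) is purely informational. A hand-written rough equivalent of what lombok generates, for illustration only:

```java
public class NonNullSketch {
    private final String transportName;

    public NonNullSketch(String transportName) {
        // Roughly what lombok.NonNull would generate for this parameter
        if (transportName == null) {
            throw new NullPointerException("transportName is marked non-null but is null");
        }
        this.transportName = transportName;
    }

    public String getTransportName() {
        return transportName;
    }
}
```

If the intent here was only to document nullability, the informational annotation is the lighter choice; if a runtime check is actually wanted, that behavioral difference should be deliberate.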

##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       This should probably be `<scope>provided</scope>`

##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>

Review comment:
       I have taken a look at the dependency footprint of this module and it looks like there is too much pulled in:
   
   Why do we need Kafka dependencies?
   - `+- org.apache.kafka:connect-json:jar:2.5.0:compile`
   - `+- org.apache.kafka:connect-api:jar:2.5.0:compile`
   - `+- org.apache.kafka:kafka-streams:jar:2.5.0:compile`
   - `+- org.apache.kafka:kafka-clients:jar:2.5.0:compile`
   
   Pulling in lombok as a compile dependency looks wrong; is it scoped correctly in the upstream module?
   - `+- org.projectlombok:lombok:jar:1.18.2:compile`
   - `\- org.projectlombok:lombok-utils:jar:1.18.12:compile`
   
   As mentioned, we should use the standard junit framework for flink:
   - `+- org.junit.jupiter:junit-jupiter-api:jar:5.6.2:test`
   
   ```
   [INFO] --- maven-dependency-plugin:3.1.1:tree (default-cli) @ flink-avro-glue-schema-registry ---
   [INFO] org.apache.flink:flink-avro-glue-schema-registry:jar:1.13-SNAPSHOT
   [INFO] +- org.apache.flink:flink-core:jar:1.13-SNAPSHOT:provided
   [INFO] |  +- org.apache.flink:flink-annotations:jar:1.13-SNAPSHOT:provided
   [INFO] |  +- org.apache.flink:flink-metrics-core:jar:1.13-SNAPSHOT:provided
   [INFO] |  +- org.apache.flink:flink-shaded-asm-7:jar:7.1-12.0:provided
   [INFO] |  +- org.apache.commons:commons-lang3:jar:3.3.2:compile
   [INFO] |  +- com.esotericsoftware.kryo:kryo:jar:2.24.0:provided
   [INFO] |  |  \- com.esotericsoftware.minlog:minlog:jar:1.2:provided
   [INFO] |  +- commons-collections:commons-collections:jar:3.2.2:provided
   [INFO] |  +- org.apache.commons:commons-compress:jar:1.20:compile
   [INFO] |  \- org.apache.flink:flink-shaded-guava:jar:18.0-12.0:compile
   [INFO] +- org.apache.flink:flink-avro:jar:1.13-SNAPSHOT:compile
   [INFO] |  \- org.apache.avro:avro:jar:1.10.0:compile
   [INFO] |     +- com.fasterxml.jackson.core:jackson-core:jar:2.12.1:compile
   [INFO] |     \- com.fasterxml.jackson.core:jackson-databind:jar:2.12.1:compile
   [INFO] |        \- com.fasterxml.jackson.core:jackson-annotations:jar:2.12.1:compile
   [INFO] +- org.apache.flink:flink-streaming-java_2.11:jar:1.13-SNAPSHOT:compile
   [INFO] |  +- org.apache.flink:flink-file-sink-common:jar:1.13-SNAPSHOT:compile
   [INFO] |  +- org.apache.flink:flink-runtime_2.11:jar:1.13-SNAPSHOT:compile
   [INFO] |  |  +- org.apache.flink:flink-queryable-state-client-java:jar:1.13-SNAPSHOT:compile
   [INFO] |  |  +- org.apache.flink:flink-hadoop-fs:jar:1.13-SNAPSHOT:compile
   [INFO] |  |  +- commons-io:commons-io:jar:2.7:compile
   [INFO] |  |  +- org.apache.flink:flink-shaded-netty:jar:4.1.49.Final-12.0:compile
   [INFO] |  |  +- org.apache.flink:flink-shaded-jackson:jar:2.10.1-12.0:compile
   [INFO] |  |  +- org.apache.flink:flink-shaded-zookeeper-3:jar:3.4.14-12.0:compile
   [INFO] |  |  +- org.javassist:javassist:jar:3.24.0-GA:compile
   [INFO] |  |  +- org.scala-lang:scala-library:jar:2.11.12:compile
   [INFO] |  |  +- com.typesafe.akka:akka-actor_2.11:jar:2.5.21:compile
   [INFO] |  |  |  +- com.typesafe:config:jar:1.3.0:compile
   [INFO] |  |  |  \- org.scala-lang.modules:scala-java8-compat_2.11:jar:0.7.0:compile
   [INFO] |  |  +- com.typesafe.akka:akka-stream_2.11:jar:2.5.21:compile
   [INFO] |  |  |  +- org.reactivestreams:reactive-streams:jar:1.0.2:compile
   [INFO] |  |  |  \- com.typesafe:ssl-config-core_2.11:jar:0.3.7:compile
   [INFO] |  |  |     \- org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.1.1:compile
   [INFO] |  |  +- com.typesafe.akka:akka-protobuf_2.11:jar:2.5.21:compile
   [INFO] |  |  +- com.typesafe.akka:akka-slf4j_2.11:jar:2.5.21:compile
   [INFO] |  |  +- org.clapper:grizzled-slf4j_2.11:jar:1.3.2:compile
   [INFO] |  |  +- com.github.scopt:scopt_2.11:jar:3.5.0:compile
   [INFO] |  |  +- org.xerial.snappy:snappy-java:jar:1.1.4:compile
   [INFO] |  |  +- com.twitter:chill_2.11:jar:0.7.6:compile
   [INFO] |  |  |  \- com.twitter:chill-java:jar:0.7.6:compile
   [INFO] |  |  \- org.lz4:lz4-java:jar:1.6.0:compile
   [INFO] |  +- org.apache.flink:flink-java:jar:1.13-SNAPSHOT:compile
   [INFO] |  \- org.apache.commons:commons-math3:jar:3.5:compile
   [INFO] +- org.apache.flink:flink-clients_2.11:jar:1.13-SNAPSHOT:compile
   [INFO] |  +- org.apache.flink:flink-optimizer_2.11:jar:1.13-SNAPSHOT:compile
   [INFO] |  \- commons-cli:commons-cli:jar:1.3.1:compile
   [INFO] +- software.amazon.glue:schema-registry-serde:jar:1.0.0:compile
   [INFO] |  +- software.amazon.glue:schema-registry-common:jar:1.0.0:compile
   [INFO] |  |  +- software.amazon.awssdk:glue:jar:2.15.32:compile
   [INFO] |  |  |  +- software.amazon.awssdk:protocol-core:jar:2.15.32:compile
   [INFO] |  |  |  +- software.amazon.awssdk:auth:jar:2.15.32:compile
   [INFO] |  |  |  |  \- software.amazon.eventstream:eventstream:jar:1.0.1:compile
   [INFO] |  |  |  +- software.amazon.awssdk:http-client-spi:jar:2.15.32:compile
   [INFO] |  |  |  +- software.amazon.awssdk:regions:jar:2.15.32:compile
   [INFO] |  |  |  +- software.amazon.awssdk:aws-core:jar:2.15.32:compile
   [INFO] |  |  |  +- software.amazon.awssdk:metrics-spi:jar:2.15.32:compile
   [INFO] |  |  |  +- software.amazon.awssdk:apache-client:jar:2.15.32:runtime
   [INFO] |  |  |  |  +- org.apache.httpcomponents:httpclient:jar:4.5.3:runtime
   [INFO] |  |  |  |  |  +- commons-logging:commons-logging:jar:1.1.3:runtime
   [INFO] |  |  |  |  |  \- commons-codec:commons-codec:jar:1.13:runtime
   [INFO] |  |  |  |  \- org.apache.httpcomponents:httpcore:jar:4.4.6:runtime
   [INFO] |  |  |  \- software.amazon.awssdk:netty-nio-client:jar:2.15.32:runtime
   [INFO] |  |  |     +- io.netty:netty-codec-http:jar:4.1.53.Final:runtime
   [INFO] |  |  |     +- io.netty:netty-codec-http2:jar:4.1.53.Final:runtime
   [INFO] |  |  |     +- io.netty:netty-codec:jar:4.1.53.Final:runtime
   [INFO] |  |  |     +- io.netty:netty-transport:jar:4.1.53.Final:runtime
   [INFO] |  |  |     |  \- io.netty:netty-resolver:jar:4.1.53.Final:runtime
   [INFO] |  |  |     +- io.netty:netty-common:jar:4.1.53.Final:runtime
   [INFO] |  |  |     +- io.netty:netty-buffer:jar:4.1.53.Final:runtime
   [INFO] |  |  |     +- io.netty:netty-handler:jar:4.1.53.Final:runtime
   [INFO] |  |  |     +- io.netty:netty-transport-native-epoll:jar:linux-x86_64:4.1.53.Final:runtime
   [INFO] |  |  |     |  \- io.netty:netty-transport-native-unix-common:jar:4.1.53.Final:runtime
   [INFO] |  |  |     \- com.typesafe.netty:netty-reactive-streams-http:jar:2.0.4:runtime
   [INFO] |  |  |        \- com.typesafe.netty:netty-reactive-streams:jar:2.0.4:runtime
   [INFO] |  |  +- software.amazon.awssdk:aws-json-protocol:jar:2.15.30:compile
   [INFO] |  |  +- software.amazon.awssdk:cloudwatch:jar:2.15.30:compile
   [INFO] |  |  |  \- software.amazon.awssdk:aws-query-protocol:jar:2.15.30:compile
   [INFO] |  |  +- software.amazon.awssdk:sdk-core:jar:2.15.30:compile
   [INFO] |  |  |  \- software.amazon.awssdk:profiles:jar:2.15.30:compile
   [INFO] |  |  +- org.apache.kafka:kafka-clients:jar:2.5.0:compile
   [INFO] |  |  |  \- com.github.luben:zstd-jni:jar:1.4.4-7:compile
   [INFO] |  |  +- org.apache.kafka:kafka-streams:jar:2.5.0:compile
   [INFO] |  |  |  +- org.apache.kafka:connect-json:jar:2.5.0:compile
   [INFO] |  |  |  |  +- org.apache.kafka:connect-api:jar:2.5.0:compile
   [INFO] |  |  |  |  \- com.fasterxml.jackson.datatype:jackson-datatype-jdk8:jar:2.12.1:compile
   [INFO] |  |  |  \- org.rocksdb:rocksdbjni:jar:5.18.3:compile
   [INFO] |  |  \- com.google.guava:guava:jar:29.0-jre:compile
   [INFO] |  |     +- com.google.guava:failureaccess:jar:1.0.1:compile
   [INFO] |  |     +- com.google.guava:listenablefuture:jar:9999.0-empty-to-avoid-conflict-with-guava:compile
   [INFO] |  |     +- org.checkerframework:checker-qual:jar:2.11.1:compile
   [INFO] |  |     +- com.google.errorprone:error_prone_annotations:jar:2.3.4:compile
   [INFO] |  |     \- com.google.j2objc:j2objc-annotations:jar:1.3:compile
   [INFO] |  +- software.amazon.awssdk:arns:jar:2.15.26:compile
   [INFO] |  |  +- software.amazon.awssdk:annotations:jar:2.15.26:compile
   [INFO] |  |  \- software.amazon.awssdk:utils:jar:2.15.26:compile
   [INFO] |  +- org.projectlombok:lombok:jar:1.18.2:compile
   [INFO] |  \- org.projectlombok:lombok-utils:jar:1.18.12:compile
   [INFO] +- org.junit.jupiter:junit-jupiter-api:jar:5.6.2:test
   [INFO] |  +- org.apiguardian:apiguardian-api:jar:1.1.0:test
   [INFO] |  +- org.opentest4j:opentest4j:jar:1.2.0:test
   [INFO] |  \- org.junit.platform:junit-platform-commons:jar:1.6.2:test
   [INFO] +- org.junit.jupiter:junit-jupiter-params:jar:5.6.2:test
   [INFO] +- org.mockito:mockito-junit-jupiter:jar:2.21.0:test
   [INFO] +- org.slf4j:slf4j-api:jar:1.7.15:provided
   [INFO] +- org.apache.flink:flink-test-utils-junit:jar:1.13-SNAPSHOT:test
   [INFO] +- org.apache.flink:force-shading:jar:1.13-SNAPSHOT:compile
   [INFO] +- com.google.code.findbugs:jsr305:jar:1.3.9:compile
   [INFO] +- junit:junit:jar:4.12:test
   [INFO] |  \- org.hamcrest:hamcrest-core:jar:1.3:test
   [INFO] +- org.mockito:mockito-core:jar:2.21.0:test
   [INFO] |  +- net.bytebuddy:byte-buddy:jar:1.8.15:test
   [INFO] |  +- net.bytebuddy:byte-buddy-agent:jar:1.8.15:test
   [INFO] |  \- org.objenesis:objenesis:jar:2.1:provided
   [INFO] +- org.powermock:powermock-module-junit4:jar:2.0.4:test
   [INFO] |  \- org.powermock:powermock-module-junit4-common:jar:2.0.4:test
   [INFO] |     +- org.powermock:powermock-reflect:jar:2.0.4:test
   [INFO] |     \- org.powermock:powermock-core:jar:2.0.4:test
   [INFO] +- org.powermock:powermock-api-mockito2:jar:2.0.4:test
   [INFO] |  \- org.powermock:powermock-api-support:jar:2.0.4:test
   [INFO] +- org.hamcrest:hamcrest-all:jar:1.3:test
   [INFO] +- org.apache.logging.log4j:log4j-slf4j-impl:jar:2.12.1:test
   [INFO] +- org.apache.logging.log4j:log4j-api:jar:2.12.1:test
   [INFO] +- org.apache.logging.log4j:log4j-core:jar:2.12.1:test
   [INFO] \- org.apache.logging.log4j:log4j-1.2-api:jar:2.12.1:test
   ```
   
   




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215",
       "triggerID" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296",
       "triggerID" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "triggerType" : "PUSH"
     }, {
       "hash" : "97864e82641279ac227eda19b938ebce62262867",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "97864e82641279ac227eda19b938ebce62262867",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 0114bac886e3c6f954632211f9a0e2f81998d1ac Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296) 
   * 97864e82641279ac227eda19b938ebce62262867 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793552614


   > What we know is that this approach works well for the S3 tests.
   
   Okay, got it. In theory the variable mapping should work, so the problem most likely lies in the condition check. I've restored the following code in the latest commit.
   ```
   SECRET_S3_ACCESS_KEY: $[variables.IT_CASE_S3_ACCESS_KEY]
   SECRET_S3_SECRET_KEY: $[variables.IT_CASE_S3_SECRET_KEY]
   ```
   Meanwhile, I changed
   ```
   if [ -n "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" ] && [ -n "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]; then
     run_test "AWS Glue Schema Registry nightly end-to-end test" "$END_TO_END_DIR/test-scripts/test_glue_schema_registry.sh"
   fi
   ```
   to
   ```
   if [[ -n "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" ]] && [[ -n "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]]; then
     run_test "AWS Glue Schema Registry nightly end-to-end test" "$END_TO_END_DIR/test-scripts/test_glue_schema_registry.sh"
   fi
   ```
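For context, the difference the change targets can be illustrated with a small sketch (the variable names below are hypothetical, not the actual CI secrets). The main practical distinction is that bash's `[[ ... ]]` performs no word splitting or pathname expansion, so an unset or empty variable is safe even unquoted, whereas the POSIX `[ ... ]` builtin requires careful quoting:

```shell
#!/usr/bin/env bash
# Hypothetical variables standing in for the CI secret keys.
unset MAYBE_ACCESS_KEY
unset MAYBE_SECRET_KEY

# POSIX single-bracket test: quoting is mandatory for safety.
if [ -n "$MAYBE_ACCESS_KEY" ]; then
  echo "single: set"
else
  echo "single: empty"
fi

# Bash double-bracket test: no word splitting, so unquoted is still safe.
if [[ -n $MAYBE_ACCESS_KEY ]]; then
  echo "double: set"
else
  echo "double: empty"
fi

# The combined check from the script above, with both variables unset:
if [[ -n $MAYBE_ACCESS_KEY ]] && [[ -n $MAYBE_SECRET_KEY ]]; then
  echo "would run the e2e test"
else
  echo "would skip the e2e test"
fi
```

With proper quoting both forms behave the same here, so switching to `[[ ... ]]` is best understood as a defensive change rather than a guaranteed fix.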





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112) Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215",
       "triggerID" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296",
       "triggerID" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * a87a26aa218658f7098367cae8c7a2ed18430296 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215) 
   * 0114bac886e3c6f954632211f9a0e2f81998d1ac Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793577861


   > Why does the S3 access key need to be added?
   
   Because some S3 end-to-end tests run against the real S3 service, not a mock or "fake" service.





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577485121



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org.apache.flink.glue.schema.registry.test/GSRKinesisPubsubClient.java
##########
@@ -0,0 +1,185 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.GetRecordsResult;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.Record;
+import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
+import org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle;
+import org.apache.flink.streaming.connectors.kinesis.proxy.GetShardListResult;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyInterface;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.EnvironmentVariableCredentialsProvider;
+import com.amazonaws.client.builder.AwsClientBuilder;
+import com.amazonaws.services.kinesis.AmazonKinesis;
+import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
+import com.amazonaws.services.kinesis.model.PutRecordRequest;
+import com.amazonaws.services.kinesis.model.PutRecordResult;
+import com.amazonaws.services.kinesis.model.ResourceNotFoundException;
+import com.amazonaws.services.schemaregistry.common.AWSDeserializerInput;
+import com.amazonaws.services.schemaregistry.common.AWSSerializerInput;
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.serializers.avro.AWSAvroSerializer;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
+import org.apache.avro.generic.GenericRecord;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.nio.ByteBuffer;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.UUID;
+
+/**
+ * Simple client to publish and retrieve messages, using the AWS Kinesis SDK, Flink Kinesis
+ * Connectors and Glue Schema Registry classes.
+ */
+public class GSRKinesisPubsubClient {
+    private static final Logger LOG = LoggerFactory.getLogger(GSRKinesisPubsubClient.class);
+
+    private final AmazonKinesis kinesisClient;
+    private final Properties properties;
+
+    public GSRKinesisPubsubClient(Properties properties) {
+        this.kinesisClient = createClientWithCredentials(properties);
+        this.properties = properties;
+    }
+
+    public void createTopic(String stream, int shards, Properties props) throws Exception {
+        try {
+            kinesisClient.describeStream(stream);

Review comment:
       This is adapted from the Kinesis Connector e2e test. I believe it checks for an existing stream with the same name and deletes it, so the test can recreate the stream with the latest data.

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org.apache.flink.glue.schema.registry.test/GSRKinesisPubsubClient.java
##########
@@ -0,0 +1,185 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.GetRecordsResult;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.Record;
+import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
+import org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle;
+import org.apache.flink.streaming.connectors.kinesis.proxy.GetShardListResult;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyInterface;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.EnvironmentVariableCredentialsProvider;
+import com.amazonaws.client.builder.AwsClientBuilder;
+import com.amazonaws.services.kinesis.AmazonKinesis;
+import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
+import com.amazonaws.services.kinesis.model.PutRecordRequest;
+import com.amazonaws.services.kinesis.model.PutRecordResult;
+import com.amazonaws.services.kinesis.model.ResourceNotFoundException;
+import com.amazonaws.services.schemaregistry.common.AWSDeserializerInput;
+import com.amazonaws.services.schemaregistry.common.AWSSerializerInput;
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.serializers.avro.AWSAvroSerializer;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
+import org.apache.avro.generic.GenericRecord;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.nio.ByteBuffer;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.UUID;
+
+/**
+ * Simple client to publish and retrieve messages, using the AWS Kinesis SDK, Flink Kinesis
+ * Connectors and Glue Schema Registry classes.
+ */
+public class GSRKinesisPubsubClient {
+    private static final Logger LOG = LoggerFactory.getLogger(GSRKinesisPubsubClient.class);
+
+    private final AmazonKinesis kinesisClient;
+    private final Properties properties;
+
+    public GSRKinesisPubsubClient(Properties properties) {
+        this.kinesisClient = createClientWithCredentials(properties);
+        this.properties = properties;
+    }
+
+    public void createTopic(String stream, int shards, Properties props) throws Exception {
+        try {
+            kinesisClient.describeStream(stream);
+            kinesisClient.deleteStream(stream);
+        } catch (ResourceNotFoundException rnfe) {
+            // expected when stream doesn't exist
+        }
+
+        kinesisClient.createStream(stream, shards);
+        Deadline deadline = Deadline.fromNow(Duration.ofSeconds(5));
+        while (deadline.hasTimeLeft()) {
+            try {
+                Thread.sleep(250); // sleep for a bit for stream to be created
+                if (kinesisClient.describeStream(stream).getStreamDescription().getShards().size()
+                        != shards) {
+                    // not fully created yet

Review comment:
       Fixed
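The create-then-poll pattern in the `createTopic` diff above (delete any stale stream, recreate it, then poll with a deadline until the expected shard count appears) can be sketched generically in shell. The function and predicate names below are illustrative, not part of the actual test code, and the real test talks to Kinesis rather than the filesystem:

```shell
#!/usr/bin/env bash
# Poll a readiness predicate until it succeeds or a deadline (in seconds) passes.
# Mirrors the Deadline.fromNow(...) / hasTimeLeft() loop in the Java snippet.
wait_until_ready() {
  local timeout_s="${1:-5}"
  local predicate="$2"
  local deadline=$(( $(date +%s) + timeout_s ))
  while [ "$(date +%s)" -lt "$deadline" ]; do
    if "$predicate"; then
      return 0            # resource is fully created
    fi
    sleep 1               # not ready yet; back off briefly and re-check
  done
  return 1                # deadline exceeded
}

# Example predicate: "ready" once a marker file exists (stands in for
# describeStream reporting the expected number of shards).
check_ready() { [ -f /tmp/stream_ready_marker ]; }

touch /tmp/stream_ready_marker
if wait_until_ready 5 check_ready; then
  echo "stream ready"
fi
```

One caveat with the original Java loop worth noting: a fixed 5-second deadline can be tight for real stream creation, so the timeout is usually worth making configurable.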







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395) 
   * 86353afe862e41b51dc48caf19f48fc03e6246b0 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 56566c305fba167267cf427dddbf33c62b04f997 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 399f06e14079a512d35814508b6f7598d7d175ba Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417) 
   * bd617892bfec1db1654606355041a6e4b9050304 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577558096



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>

Review comment:
       The enforcer-check skip property has been removed. Currently, the versions of all dependencies with convergence errors are pinned explicitly in the `Flink-GSR` module and its e2e test module. Once the new version of the GSR package with reorganized dependencies is released to Maven, the version pins can be removed.







[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-792941032


   > https://dev.azure.com/rmetzger/Flink/_build/results?buildId=8949&view=results
   
   Hi Robert, can I check with you on how the secret forwarding works, and whether it would affect the environment variables of the main CI?





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577986479



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>

Review comment:
       OK, so the dependency on Kafka has been removed in the new version? What is the ECD (estimated completion date) for the new version?







[GitHub] [flink] rmetzger commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r587207858



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/User.java
##########
@@ -0,0 +1,434 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.avro.message.BinaryMessageDecoder;
+import org.apache.avro.message.BinaryMessageEncoder;
+import org.apache.avro.message.SchemaStore;
+import org.apache.avro.specific.SpecificData;
+
+@SuppressWarnings("all")
+@org.apache.avro.specific.AvroGenerated
+public class User extends org.apache.avro.specific.SpecificRecordBase

Review comment:
       I'm okay with addressing this in a follow up PR if you prefer.







[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-790375946


   I pushed this PR to my personal azure, where the CI credentials are provided: https://dev.azure.com/rmetzger/Flink/_build/results?buildId=8929&view=results





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215",
       "triggerID" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296",
       "triggerID" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "triggerType" : "PUSH"
     }, {
       "hash" : "97864e82641279ac227eda19b938ebce62262867",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14316",
       "triggerID" : "97864e82641279ac227eda19b938ebce62262867",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5779d8d7942ed9d747a4f07043c3fad3d1ff82f0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14331",
       "triggerID" : "5779d8d7942ed9d747a4f07043c3fad3d1ff82f0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "cb96570901cbca2a6c9fdefc98c4154839194fc1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14335",
       "triggerID" : "cb96570901cbca2a6c9fdefc98c4154839194fc1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bafd3a9a41461850057e9f2d50c3cea1c52aad7a",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14340",
       "triggerID" : "bafd3a9a41461850057e9f2d50c3cea1c52aad7a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c4eb439a79d18f8296055e3582a0093146cbacc7",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "c4eb439a79d18f8296055e3582a0093146cbacc7",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * bafd3a9a41461850057e9f2d50c3cea1c52aad7a Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14340) 
   * c4eb439a79d18f8296055e3582a0093146cbacc7 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dannycranmer merged pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer merged pull request #14737:
URL: https://github.com/apache/flink/pull/14737


   





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 399f06e14079a512d35814508b6f7598d7d175ba Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395) 
   * 86353afe862e41b51dc48caf19f48fc03e6246b0 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397) 
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793624549


   > SECRET_GLUE_SCHEMA_ACCESS_KEY: $[variables.IT_CASE_GLUE_SCHEMA_ACCESS_KEY]
   >   SECRET_GLUE_SCHEMA_SECRET_KEY: $[variables.IT_CASE_GLUE_SCHEMA_SECRET_KEY]
   
   Added
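
For context, Azure Pipelines does not expose secret variables to build steps automatically; they must be mapped explicitly into the environment of the step that needs them. A sketch of where the two lines quoted above would live (the step script name is illustrative):

```yaml
- script: ./run-glue-schema-registry-e2e-tests.sh   # illustrative step
  env:
    # Secret pipeline variables are never injected into the environment
    # by default; each one has to be mapped explicitly here.
    SECRET_GLUE_SCHEMA_ACCESS_KEY: $[variables.IT_CASE_GLUE_SCHEMA_ACCESS_KEY]
    SECRET_GLUE_SCHEMA_SECRET_KEY: $[variables.IT_CASE_GLUE_SCHEMA_SECRET_KEY]
```

This explicit mapping is what keeps secrets out of forked-PR builds, which is why the main CI needs the variables provisioned before the e2e test can run there.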





[GitHub] [flink] zentol commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
zentol commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577883615



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,252 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.3</httpclient.version>
+		<httpcore.version>4.4.6</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+	</properties>
+
+	<!-- ============================= -->
+	<!-- DEPENDENCY MANAGEMENT -->
+	<!-- ============================= -->
+	<dependencyManagement>

Review comment:
       Dependency management sections can be useful even in the absence of child modules: they let you override the versions of transitive dependencies without adding explicit dependencies, which would change the dependency-tree structure and obscure where a dependency actually comes from.
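
To illustrate that point, a `<dependencyManagement>` section can pin a transitive dependency's version without declaring it as a direct dependency; the artifact below is one of the versions already defined in this pom's properties, used here purely as an example:

```xml
<dependencyManagement>
	<dependencies>
		<!-- Pins the version of a transitive dependency (e.g. one pulled in
		     by the GSR serializer) without adding it to this module's direct
		     dependencies, so the dependency tree still shows its true origin. -->
		<dependency>
			<groupId>org.apache.httpcomponents</groupId>
			<artifactId>httpclient</artifactId>
			<version>${httpclient.version}</version>
		</dependency>
	</dependencies>
</dependencyManagement>
```
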







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395) 
   * 86353afe862e41b51dc48caf19f48fc03e6246b0 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397) 
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 81adc9cccd9fc0247e58ad7688252e1d98382cc5 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-790878304


   yes, rebasing to the latest master should fix the issue!





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r573954198



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSerializationSchema.java
##########
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.formats.avro.RegistryAvroSerializationSchema;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import lombok.SneakyThrows;
+import org.apache.avro.Schema;
+import org.apache.avro.generic.GenericRecord;
+import org.apache.avro.io.Encoder;
+import org.apache.avro.specific.SpecificRecord;
+
+import javax.annotation.Nullable;
+
+import java.io.ByteArrayOutputStream;
+import java.util.Map;
+
+/**
+ * AWS Glue Schema Registry serialization schema that serializes records to Avro binary
+ * format for Flink producers.
+ *
+ * @param <T> the type to be serialized
+ */
+public class GlueSchemaRegistryAvroSerializationSchema<T>
+        extends RegistryAvroSerializationSchema<T> {
+    /**
+     * Creates an Avro serialization schema.
+     *
+     * @param recordClazz class to serialize. Should be one of: {@link SpecificRecord}, {@link
+     *     GenericRecord}.
+     * @param reader reader's Avro schema. Should be provided if recordClazz is {@link
+     *     GenericRecord}
+     * @param schemaCoderProvider schema coder provider which reads writer schema from AWS Glue
+     *     Schema Registry
+     */
+    private GlueSchemaRegistryAvroSerializationSchema(
+            Class<T> recordClazz,
+            @Nullable Schema reader,
+            SchemaCoder.SchemaCoderProvider schemaCoderProvider) {
+        super(recordClazz, reader, schemaCoderProvider);
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSerializationSchema(
+            Class<T> recordClazz, @Nullable Schema reader, SchemaCoder schemaCoder) {
+        // Pass null schema coder provider
+        super(recordClazz, reader, null);
+        this.schemaCoder = schemaCoder;
+    }
+
+    /**
+     * Creates {@link GlueSchemaRegistryAvroSerializationSchema} that serializes {@link
+     * GenericRecord} using provided schema.
+     *
+     * @param schema the schema that will be used for serialization
+     * @param transportName topic name or stream name etc.
+     * @param configs configuration map of AWS Glue Schema Registry
+     * @return serialization schema that serializes {@link GenericRecord} using the provided schema
+     */
+    public static GlueSchemaRegistryAvroSerializationSchema<GenericRecord> forGeneric(
+            Schema schema, String transportName, Map<String, Object> configs) {
+        return new GlueSchemaRegistryAvroSerializationSchema<>(
+                GenericRecord.class,
+                schema,
+                new GlueSchemaRegistryAvroSchemaCoderProvider(transportName, configs));
+    }
+
+    /**
+     * Creates {@link GlueSchemaRegistryAvroSerializationSchema} that serializes {@link
+     * SpecificRecord} using provided schema.
+     *
+     * @param clazz the type to be serialized
+     * @param transportName topic name or stream name etc.
+     * @param configs configuration map of AWS Glue Schema Registry
+     * @return serialization schema that serializes the given {@link SpecificRecord} type
+     */
+    public static <T extends SpecificRecord>
+            GlueSchemaRegistryAvroSerializationSchema<T> forSpecific(
+                    Class<T> clazz, String transportName, Map<String, Object> configs) {
+        return new GlueSchemaRegistryAvroSerializationSchema<>(
+                clazz, null, new GlueSchemaRegistryAvroSchemaCoderProvider(transportName, configs));
+    }
+
+    /**
+     * Serializes the incoming element to a byte array containing bytes of AWS Glue Schema registry
+     * information.
+     *
+     * @param object The incoming element to be serialized
+     * @return The serialized bytes.
+     */
+    @SneakyThrows
+    @Override
+    public byte[] serialize(T object) {
+        checkAvroInitialized();
+
+        if (object == null) {
+            return null;
+        } else {
+            ByteArrayOutputStream outputStream = getOutputStream();
+            outputStream.reset();
+            Encoder encoder = getEncoder();
+            getDatumWriter().write(object, encoder);
+            schemaCoder.writeSchema(getSchema(), outputStream);
+            encoder.flush();
+
+            return outputStream.toByteArray();
+        }
+    }

Review comment:
       Yes, the reason to override this method is that we need to add a header byte and a compression byte to the serialized byte array. So we change the order of writing the schema and writing the object, and we add extra methods in _writeSchema_ to achieve this.
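
    For readers unfamiliar with the framing being discussed, here is a minimal, self-contained sketch (not the actual Glue Schema Registry implementation) of what "header byte + compression byte before the payload" looks like. The constant values and the 16-byte schema-version UUID layout are assumptions for illustration only:

    ```java
    import java.nio.ByteBuffer;
    import java.util.UUID;

    public class GsrFramingSketch {
        // Assumed values, chosen only for this sketch.
        static final byte HEADER_VERSION_BYTE = 3; // hypothetical protocol-version marker
        static final byte COMPRESSION_NONE = 0;    // hypothetical "no compression" flag

        /** Prepends a header byte, a compression byte, and a 16-byte schema-version UUID. */
        static byte[] frame(byte headerVersion, byte compression, UUID schemaVersionId, byte[] avroPayload) {
            ByteBuffer buffer = ByteBuffer.allocate(1 + 1 + 16 + avroPayload.length);
            buffer.put(headerVersion);                                   // header byte
            buffer.put(compression);                                     // compression byte
            buffer.putLong(schemaVersionId.getMostSignificantBits());    // UUID, high 8 bytes
            buffer.putLong(schemaVersionId.getLeastSignificantBits());   // UUID, low 8 bytes
            buffer.put(avroPayload);                                     // Avro-encoded record
            return buffer.array();
        }

        public static void main(String[] args) {
            byte[] payload = {42, 43, 44}; // stand-in for Avro-encoded record bytes
            UUID schemaVersionId = UUID.fromString("00000000-0000-0000-0000-000000000001");
            byte[] framed = frame(HEADER_VERSION_BYTE, COMPRESSION_NONE, schemaVersionId, payload);
            if (framed.length != 1 + 1 + 16 + payload.length) {
                throw new AssertionError("unexpected frame length");
            }
            if (framed[0] != HEADER_VERSION_BYTE) {
                throw new AssertionError("header byte not first");
            }
            System.out.println("framed length = " + framed.length);
        }
    }
    ```

    This also shows why the schema must be written before flushing the payload into the output stream: the registry-specific prefix has to precede the Avro bytes.
    
    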







[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-791200406


   Both CIs are green; however, the e2e test didn't execute on my CI. I'll quickly try to fix that.





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * f793772656ad942f463a23b0dd43f3522f147493 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-794169266


   Hi Robert, main CI passed now. Would you please bring the latest commit to your personal CI to verify it and then we can close this PR?
   
   





[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-794240830


   I also need to set this up. I was planning on taking a look tomorrow morning. Let me try to setup my personal Azure and verify your change. I will update you tomorrow.





[GitHub] [flink] zentol commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
zentol commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577883615



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,252 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.3</httpclient.version>
+		<httpcore.version>4.4.6</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+	</properties>
+
+	<!-- ============================= -->
+	<!-- DEPENDENCY MANAGEMENT -->
+	<!-- ============================= -->
+	<dependencyManagement>

Review comment:
       dependency management sections can be useful even in the absence of child modules because you can modify versions of transitive dependencies without adding an explicit dependency, which would change the dependency tree and obfuscate where the dependency actually comes from.







[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r583545398



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,259 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.9</httpclient.version>
+		<httpcore.version>4.4.11</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+		<reactivestreams.version>1.0.2</reactivestreams.version>
+		<lz4.version>1.6.0</lz4.version>
+		<netty.version>4.1.53.Final</netty.version>
+		<guava.version>29.0-jre</guava.version>
+	</properties>
+
+	<!-- ============================= -->
+	<!-- DEPENDENCY MANAGEMENT -->
+	<!-- ============================= -->
+	<dependencyManagement>
+		<dependencies>
+			<!-- dependencies to solve enforcer check issue -->
+			<!-- can be removed once new version of Glue Schema Registry releases -->
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>http-client-spi</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>aws-core</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>protocol-core</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>annotations</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>utils</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>apache-client</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>org.reactivestreams</groupId>
+				<artifactId>reactive-streams</artifactId>
+				<version>${reactivestreams.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>org.lz4</groupId>
+				<artifactId>lz4-java</artifactId>
+				<version>${lz4.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>aws-json-protocol</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>regions</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>sdk-core</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>io.netty</groupId>
+				<artifactId>netty-codec-http</artifactId>
+				<version>${netty.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>netty-nio-client</artifactId>
+				<version>${aws.sdkv2.version}</version>
+				<scope>runtime</scope>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>auth</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>metrics-spi</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>io.netty</groupId>
+				<artifactId>netty-handler</artifactId>
+				<version>${netty.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>com.google.guava</groupId>
+				<artifactId>guava</artifactId>
+				<version>${guava.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>org.apache.httpcomponents</groupId>
+				<artifactId>httpclient</artifactId>
+				<version>${httpclient.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>org.apache.httpcomponents</groupId>
+				<artifactId>httpcore</artifactId>
+				<version>${httpcore.version}</version>
+			</dependency>
+		</dependencies>
+	</dependencyManagement>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-kinesis-test_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro-glue-schema-registry</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>junit</groupId>
+			<artifactId>junit</artifactId>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+			<type>test-jar</type>
+		</dependency>
+
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>sdk-core</artifactId>
+			<version>${aws.sdkv2.version}</version>
+		</dependency>
+		<dependency>
+			<groupId>com.amazonaws</groupId>
+			<artifactId>aws-java-sdk-kinesis</artifactId>
+			<version>${aws.sdk.version}</version>
+		</dependency>
+	</dependencies>
+
+	<build>
+		<plugins>
+			<!-- Use the shade plugin to build a fat jar for the Kinesis connector test -->
+			<plugin>
+				<groupId>org.apache.maven.plugins</groupId>
+				<artifactId>maven-shade-plugin</artifactId>

Review comment:
       @rmetzger please confirm whether this module requires a `NOTICE` for bundled dependencies? I am not sure if this rule applies to the tests since they are not shipped to maven repo. Also equivalent elasticsearch modules that shade do not have one: 
   - https://github.com/apache/flink/blob/master/flink-end-to-end-tests/flink-elasticsearch5-test/pom.xml#L55







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215",
       "triggerID" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * a87a26aa218658f7098367cae8c7a2ed18430296 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215) 
   * 0114bac886e3c6f954632211f9a0e2f81998d1ac UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r594507106



##########
File path: flink-end-to-end-tests/test-scripts/common.sh
##########
@@ -364,6 +364,7 @@ function check_logs_for_errors {
       | grep -v "HeapDumpOnOutOfMemoryError" \
       | grep -v "error_prone_annotations" \
       | grep -v "Error sending fetch request" \
+      | grep -v "WARN  akka.remote.ReliableDeliverySupervisor" \

Review comment:
       > @LinyuYao1021 Do we still need this change?
   
   Yes, it skips the `akka` exception check for logs.
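    For context, the `grep -v` chain in `check_logs_for_errors` acts as an allowlist of known-benign patterns: any log line containing one of them is dropped before the error check runs. Below is a minimal Java sketch of that same filtering logic — the pattern list is copied from the diff above, while the class and method names are purely illustrative:

    ```java
    import java.util.List;
    import java.util.stream.Collectors;

    public class LogErrorFilter {
        // Known-benign patterns that should not fail the e2e log check,
        // mirroring the `grep -v` chain in common.sh; the akka entry is
        // the one added in this change.
        private static final List<String> ALLOWED = List.of(
                "HeapDumpOnOutOfMemoryError",
                "error_prone_annotations",
                "Error sending fetch request",
                "WARN  akka.remote.ReliableDeliverySupervisor");

        // Returns only the lines that match none of the allowed patterns,
        // i.e. the lines that would still trip the error check.
        public static List<String> suspiciousLines(List<String> logLines) {
            return logLines.stream()
                    .filter(line -> ALLOWED.stream().noneMatch(line::contains))
                    .collect(Collectors.toList());
        }

        public static void main(String[] args) {
            List<String> logs = List.of(
                    "WARN  akka.remote.ReliableDeliverySupervisor - association failed",
                    "ERROR something actually broke");
            // prints "[ERROR something actually broke]"
            System.out.println(suspiciousLines(logs));
        }
    }
    ```
    
    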







[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577985579



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,252 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.3</httpclient.version>
+		<httpcore.version>4.4.6</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+	</properties>
+
+	<!-- ============================= -->
+	<!-- DEPENDENCY MANAGEMENT -->
+	<!-- ============================= -->
+	<dependencyManagement>

Review comment:
       I did not know that, thanks! That sounds like a much better approach than adding an explicit dependency 👍 







[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-796748426


   Thanks @rmetzger. GSR branch running: 
   - https://dev.azure.com/georgeryan1322/Flink/_build/results?buildId=351&view=results





[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-781667422


   As a follow-up, we should add support for the SQL client and Table API by:
   - Adding a shaded module for SQL client, similar to [flink-sql-avro-confluent-registry](https://github.com/apache/flink/tree/master/flink-formats/flink-sql-avro-confluent-registry)
   - Create a [FormatFactory](https://github.com/apache/flink/blob/master/flink-formats/flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroFormatFactory.java)
   - Add a [service](https://github.com/apache/flink/blob/master/flink-formats/flink-avro-confluent-registry/src/main/resources/META-INF/services/org.apache.flink.table.factories.Factory)
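   The mechanism behind the last two steps is Java's `ServiceLoader`: Flink discovers `Factory` implementations registered under `META-INF/services` and matches them against the table's declared format identifier. The following is a hedged, self-contained sketch of that lookup — the `avro-glue` identifier and the `GlueAvroFormatFactory` name are placeholders, not necessarily what the eventual module would use:

   ```java
   import java.util.List;
   import java.util.NoSuchElementException;

   // Simplified stand-in for Flink's factory SPI; the real interface is
   // org.apache.flink.table.factories.Factory.
   interface FormatFactory {
       String factoryIdentifier();
   }

   // Hypothetical factory for the Glue Schema Registry Avro format.
   class GlueAvroFormatFactory implements FormatFactory {
       public String factoryIdentifier() {
           return "avro-glue";
       }
   }

   public class FactoryDiscovery {
       // In Flink, `loaded` would come from ServiceLoader entries registered in
       // META-INF/services/org.apache.flink.table.factories.Factory; here it is
       // passed in explicitly so the sketch stays self-contained.
       static FormatFactory discover(List<FormatFactory> loaded, String identifier) {
           return loaded.stream()
                   .filter(f -> f.factoryIdentifier().equals(identifier))
                   .findFirst()
                   .orElseThrow(() -> new NoSuchElementException(
                           "No factory for identifier: " + identifier));
       }

       public static void main(String[] args) {
           FormatFactory f = discover(List.of(new GlueAvroFormatFactory()), "avro-glue");
           // prints "avro-glue"
           System.out.println(f.factoryIdentifier());
       }
   }
   ```
   
   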





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148) Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766991327


   > What's the relationship of this PR to #14490 ?
   
   It's the same one, but with the compilation error fixed.





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215",
       "triggerID" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296",
       "triggerID" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "triggerType" : "PUSH"
     }, {
       "hash" : "97864e82641279ac227eda19b938ebce62262867",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14316",
       "triggerID" : "97864e82641279ac227eda19b938ebce62262867",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5779d8d7942ed9d747a4f07043c3fad3d1ff82f0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14331",
       "triggerID" : "5779d8d7942ed9d747a4f07043c3fad3d1ff82f0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "cb96570901cbca2a6c9fdefc98c4154839194fc1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14335",
       "triggerID" : "cb96570901cbca2a6c9fdefc98c4154839194fc1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bafd3a9a41461850057e9f2d50c3cea1c52aad7a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14340",
       "triggerID" : "bafd3a9a41461850057e9f2d50c3cea1c52aad7a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c4eb439a79d18f8296055e3582a0093146cbacc7",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14504",
       "triggerID" : "c4eb439a79d18f8296055e3582a0093146cbacc7",
       "triggerType" : "PUSH"
     }, {
       "hash" : "288129b58edfbe6928104e517e3e7243e4b6e0d2",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14510",
       "triggerID" : "288129b58edfbe6928104e517e3e7243e4b6e0d2",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 288129b58edfbe6928104e517e3e7243e4b6e0d2 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14510) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-790381549


   Thanks for your review @dannycranmer!
   
   Before we can merge this PR, we need to make sure that CI is passing without credentials (this is the case in the PR CI validation), and with credentials (my personal CI).





[GitHub] [flink] zentol commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
zentol commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577888627



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryInputStreamDeserializerTest.java
##########
@@ -0,0 +1,282 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.utils.MutableByteArrayInputStream;
+
+import com.amazonaws.services.schemaregistry.common.AWSCompressionHandler;
+import com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryDefaultCompression;
+import com.amazonaws.services.schemaregistry.common.configs.GlueSchemaRegistryConfiguration;
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import lombok.NonNull;
+import org.apache.avro.Schema;
+import org.apache.avro.io.BinaryEncoder;
+import org.apache.avro.io.DatumWriter;
+import org.apache.avro.io.EncoderFactory;
+import org.apache.avro.specific.SpecificDatumWriter;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.ExpectedException;
+import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.io.ByteArrayOutputStream;
+import java.io.File;
+import java.io.IOException;
+import java.nio.ByteBuffer;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.UUID;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.equalTo;
+import static org.hamcrest.Matchers.instanceOf;
+
+/** Tests for {@link GlueSchemaRegistryInputStreamDeserializer}. */
+public class GlueSchemaRegistryInputStreamDeserializerTest {
+    private static final String testTopic = "Test-Topic";
+    private static final UUID USER_SCHEMA_VERSION_ID = UUID.randomUUID();
+    private static final String AVRO_USER_SCHEMA_FILE = "src/test/java/resources/avro/user.avsc";
+    private static byte compressionByte;
+    private static Schema userSchema;
+    private static com.amazonaws.services.schemaregistry.common.Schema glueSchema;
+    private static User userDefinedPojo;
+    private static Map<String, Object> configs = new HashMap<>();
+    private static Map<String, String> metadata = new HashMap<>();
+    private static AWSCompressionHandler awsCompressionHandler;
+    private static AwsCredentialsProvider credentialsProvider =
+            DefaultCredentialsProvider.builder().build();
+    @Rule public ExpectedException thrown = ExpectedException.none();
+    private AWSDeserializer mockDeserializer;
+
+    @Before

Review comment:
       Have you considered using `@BeforeClass` instead?
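        For context on the suggestion: JUnit runs an `@Before` method once per test method, whereas `@BeforeClass` (which must be static) runs once per class, so static fixtures like the ones above are cheaper to build there. A plain-Java illustration of the cost difference, with counters standing in for JUnit's lifecycle hooks (no JUnit dependency; purely illustrative):

    ```java
    // Illustrates why static fixtures are better initialized once per class:
    // per-test setup (analogous to JUnit's @Before) runs for every test method,
    // while once-per-class setup (analogous to @BeforeClass) runs a single time.
    public class SetupCost {
        static int perClassRuns = 0;
        static int perTestRuns = 0;

        static void beforeClass() { perClassRuns++; }  // @BeforeClass analogue
        static void before() { perTestRuns++; }        // @Before analogue

        static void runSuite(int testCount) {
            perClassRuns = 0;
            perTestRuns = 0;
            beforeClass();                 // once for the whole class
            for (int i = 0; i < testCount; i++) {
                before();                  // once per test method
                // ... test body would run here ...
            }
        }

        public static void main(String[] args) {
            runSuite(3);
            // prints "1 3"
            System.out.println(perClassRuns + " " + perTestRuns);
        }
    }
    ```
    
    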







[GitHub] [flink] flinkbot commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 399f06e14079a512d35814508b6f7598d7d175ba UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 56566c305fba167267cf427dddbf33c62b04f997 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426) 
   * 061978aeb474636d954f85b0408b00a21f2571e5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] zentol commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
zentol commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577887598



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryOutputStreamSerializerTest.java
##########
@@ -0,0 +1,132 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import com.amazonaws.services.schemaregistry.common.configs.GlueSchemaRegistryConfiguration;
+import com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistrySerializationFacade;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import org.apache.avro.Schema;
+import org.junit.Before;
+import org.junit.Test;
+import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.io.ByteArrayOutputStream;
+import java.io.File;
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.Map;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.instanceOf;
+
+/** Tests for {@link GlueSchemaRegistryOutputStreamSerializer}. */
+public class GlueSchemaRegistryOutputStreamSerializerTest {

Review comment:
       It is common for all test classes to extend `TestLogger`.
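
   zentol's suggestion refers to Flink's `TestLogger` base class, which logs the name of each running test and failure stack traces so that CI output is easier to correlate. The following is a minimal sketch of the pattern; `TestLogger` here is a simplified stand-in for the real `org.apache.flink.util.TestLogger` (an assumption, not the actual class), and the test body is a placeholder:

   ```java
   // Simplified stand-in for org.apache.flink.util.TestLogger (assumption: the
   // real class is a JUnit rule that logs test names and failure stack traces).
   class TestLogger {
       protected void logTestStart(String testName) {
           System.out.println("[TEST] starting: " + testName);
       }
   }

   // Applying the suggestion: the test class extends TestLogger so that test
   // boundaries show up in the build logs.
   class GlueSerializerTestSketch extends TestLogger {
       boolean registerAndSerializeRoundTrip() {
           logTestStart("registerAndSerializeRoundTrip");
           byte[] payload = {1, 2, 3};  // placeholder for serialized Avro bytes
           return payload.length == 3;  // placeholder assertion
       }

       public static void main(String[] args) {
           System.out.println(new GlueSerializerTestSketch().registerAndSerializeRoundTrip());
       }
   }
   ```

   In the PR itself the change amounts to adding `extends TestLogger` to each test class declaration.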







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * bafd3a9a41461850057e9f2d50c3cea1c52aad7a Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14340) 
   * c4eb439a79d18f8296055e3582a0093146cbacc7 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14504) 
   * 288129b58edfbe6928104e517e3e7243e4b6e0d2 UNKNOWN
   





[GitHub] [flink] LinyuYao1021 edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-794169266


   Hi Robert, the main CI passes now. Could you please run the latest commit on your personal CI to verify it, so that we can close this PR? @rmetzger





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * f793772656ad942f463a23b0dd43f3522f147493 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953) 
   * fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112) 
   





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577390289



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderTest.java
##########
@@ -0,0 +1,287 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import com.amazonaws.services.schemaregistry.caching.AWSSchemaRegistrySerializerCache;
+import com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryClient;
+import com.amazonaws.services.schemaregistry.common.configs.GlueSchemaRegistryConfiguration;
+import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
+import com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistrySerializationFacade;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import org.apache.avro.Schema;
+import org.hamcrest.Matchers;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.EnumSource;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
+import software.amazon.awssdk.services.glue.model.EntityNotFoundException;
+
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.File;
+import java.io.IOException;
+import java.io.InputStream;
+import java.lang.reflect.Field;
+import java.nio.ByteBuffer;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.UUID;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.CoreMatchers.notNullValue;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.ArgumentMatchers.anyMap;
+import static org.mockito.ArgumentMatchers.anyString;
+import static org.mockito.Mockito.doCallRealMethod;
+import static org.mockito.Mockito.spy;
+import static org.mockito.Mockito.when;

Review comment:
       Fixed

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,150 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test_${scala.binary.version}</artifactId>

Review comment:
       Fixed







[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577781307



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>

Review comment:
       Removed







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 0114bac886e3c6f954632211f9a0e2f81998d1ac Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296) 
   * 97864e82641279ac227eda19b938ebce62262867 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14316) 
   





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r585024774



##########
File path: flink-end-to-end-tests/test-scripts/test_glue_schema_registry.sh
##########
@@ -0,0 +1,78 @@
+#!/usr/bin/env bash
+################################################################################
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+################################################################################
+# To run this test locally, AWS credential is required.

Review comment:
       Test passes with the latest push, thanks
   







[GitHub] [flink] LinyuYao1021 edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-795179693


   Here is my pipeline, which runs my commit directly so we can see what the problem is:
   
   https://dev.azure.com/yaolinyu3547/PrivateFlink/_build/results?buildId=5&view=results





[GitHub] [flink] dannycranmer edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-780601429


   I am seeing a test failure when running `mvn clean install` on the `flink-avro-glue-schema-registry` module:
   
   ```
   [ERROR] Errors:
   [ERROR]   GlueSchemaRegistryAvroSerializationSchemaTest.<init>:58 » IllegalArgument glue...
   [ERROR]   GlueSchemaRegistryAvroSerializationSchemaTest.<init>:58 » IllegalArgument glue...
   [ERROR]   GlueSchemaRegistryAvroSerializationSchemaTest.<init>:58 » IllegalArgument glue...
   [ERROR]   GlueSchemaRegistryAvroSerializationSchemaTest.<init>:58 » IllegalArgument glue...
   [ERROR]   GlueSchemaRegistryAvroSerializationSchemaTest.<init>:58 » IllegalArgument glue...
   [INFO]
   [ERROR] Tests run: 19, Failures: 0, Errors: 5, Skipped: 0
   ```
   
   It looks like `glueSchemaRegistryConfiguration` is `null` when the test constructor runs.
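
   A minimal sketch of the kind of eager null check that would surface this failure in a constructor. The class and argument names here are illustrative stand-ins, not the PR's actual code; only the failure mode (an `IllegalArgumentException` on a missing configuration map) mirrors the errors above.

```java
import java.util.Map;

// Hypothetical illustration: a serialization-schema constructor that rejects
// a missing Glue Schema Registry configuration map up front, producing the
// IllegalArgumentException seen in the test errors.
public class GlueConfigCheckSketch {
    private final Map<String, Object> configs;

    GlueConfigCheckSketch(Map<String, Object> configs) {
        if (configs == null || configs.isEmpty()) {
            throw new IllegalArgumentException(
                    "glueSchemaRegistryConfiguration must not be null or empty");
        }
        this.configs = configs;
    }

    public static void main(String[] args) {
        boolean threw = false;
        try {
            new GlueConfigCheckSketch(null);
        } catch (IllegalArgumentException e) {
            threw = true;
        }
        if (!threw) {
            throw new AssertionError("expected IllegalArgumentException for null configs");
        }
        System.out.println("null configs rejected as expected");
    }
}
```

   Wiring the configuration through the test's constructor (or failing fast like this) makes the root cause visible at construction time instead of deep inside serialization.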





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215",
       "triggerID" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296",
       "triggerID" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "triggerType" : "PUSH"
     }, {
       "hash" : "97864e82641279ac227eda19b938ebce62262867",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14316",
       "triggerID" : "97864e82641279ac227eda19b938ebce62262867",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5779d8d7942ed9d747a4f07043c3fad3d1ff82f0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14331",
       "triggerID" : "5779d8d7942ed9d747a4f07043c3fad3d1ff82f0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "cb96570901cbca2a6c9fdefc98c4154839194fc1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14335",
       "triggerID" : "cb96570901cbca2a6c9fdefc98c4154839194fc1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bafd3a9a41461850057e9f2d50c3cea1c52aad7a",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14340",
       "triggerID" : "bafd3a9a41461850057e9f2d50c3cea1c52aad7a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * bafd3a9a41461850057e9f2d50c3cea1c52aad7a Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14340) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148) Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112) 
   * c77e3d4a1a7484c4f1e24b23a1099364b834cf75 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 56566c305fba167267cf427dddbf33c62b04f997 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426) 
   * 061978aeb474636d954f85b0408b00a21f2571e5 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577390089



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org.apache.flink.glue.schema.registry.test/GlueSchemaRegistryExampleTest.java
##########
@@ -0,0 +1,146 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.api.java.utils.ParameterTool;
+
+import org.apache.avro.Schema;
+import org.apache.avro.generic.GenericData;
+import org.apache.avro.generic.GenericRecord;
+import org.junit.Assert;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.concurrent.atomic.AtomicReference;
+
+/** Test driver for {@link GlueSchemaRegistryExample#main}. */
+public class GlueSchemaRegistryExampleTest {
+    private static final Logger LOG = LoggerFactory.getLogger(GlueSchemaRegistryExampleTest.class);
+
+    public static void main(String[] args) throws Exception {
+        LOG.info("System properties: {}", System.getProperties());
+        final ParameterTool parameterTool = ParameterTool.fromArgs(args);
+
+        String inputStream = parameterTool.getRequired("input-stream");
+        String outputStream = parameterTool.getRequired("output-stream");
+
+        GSRKinesisPubsubClient pubsub = new GSRKinesisPubsubClient(parameterTool.getProperties());
+        pubsub.createTopic(inputStream, 2, parameterTool.getProperties());
+        pubsub.createTopic(outputStream, 2, parameterTool.getProperties());
+
+        // The example job needs to start after streams are created and run in parallel to the
+        // validation logic.
+        // The thread that runs the job won't terminate, we don't have a job reference to cancel it.
+        // Once results are validated, the driver main thread will exit; job/cluster will be
+        // terminated from script.
+        final AtomicReference<Exception> executeException = new AtomicReference<>();
+        Thread executeThread =
+                new Thread(
+                        () -> {
+                            try {
+                                GlueSchemaRegistryExample.main(args);
+                                // this message won't appear in the log,
+                                // job is terminated when shutting down cluster
+                                LOG.info("executed program");
+                            } catch (Exception e) {
+                                executeException.set(e);
+                            }
+                        });
+        executeThread.start();
+
+        List<GenericRecord> messages = getRecords();
+        for (GenericRecord msg : messages) {
+            pubsub.sendMessage(GlueSchemaRegistryExample.getSchema().toString(), inputStream, msg);
+        }
+        LOG.info("generated records");
+
+        Deadline deadline = Deadline.fromNow(Duration.ofSeconds(60));
+        List<Object> results = pubsub.readAllMessages(outputStream);
+        while (deadline.hasTimeLeft()
+                && executeException.get() == null
+                && results.size() < messages.size()) {
+            LOG.info("waiting for results..");
+            Thread.sleep(1000);
+            results = pubsub.readAllMessages(outputStream);
+        }
+
+        if (executeException.get() != null) {
+            throw executeException.get();
+        }
+
+        LOG.info("results: {}", results);
+        Assert.assertEquals(
+                "Results received from '" + outputStream + "': " + results,
+                messages.size(),
+                results.size());
+
+        List<GenericRecord> expectedResults = getRecords();
+
+        for (Object expectedResult : expectedResults) {
+            Assert.assertTrue(results.contains(expectedResult));
+        }
+
+        // TODO: main thread needs to create job or CLI fails with:
+        // "The program didn't contain a Flink job. Perhaps you forgot to call execute() on the
+        // execution environment."
+        System.out.println("test finished");

Review comment:
       Fixed
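
The deadline-based wait loop in the quoted test driver can be sketched in isolation like this. The result source and helper names are stand-ins (the real driver calls `pubsub.readAllMessages(outputStream)` and sleeps one second between reads); only the control flow mirrors the code above.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Standalone sketch of the driver's polling pattern: re-read the output
// source until either all expected records arrive or the deadline expires.
public class DeadlinePollSketch {
    static List<Integer> pollUntil(int expected, Duration timeout, List<Integer> source)
            throws InterruptedException {
        Instant deadline = Instant.now().plus(timeout);
        List<Integer> results = new ArrayList<>(source);
        while (Instant.now().isBefore(deadline) && results.size() < expected) {
            Thread.sleep(10); // the real driver sleeps 1000 ms between reads
            results = new ArrayList<>(source); // stands in for pubsub.readAllMessages(...)
        }
        return results;
    }

    public static void main(String[] args) throws InterruptedException {
        List<Integer> source = List.of(1, 2, 3);
        List<Integer> got = pollUntil(3, Duration.ofSeconds(1), source);
        if (got.size() != 3) {
            throw new AssertionError("expected 3 results, got " + got.size());
        }
        System.out.println("received all results before deadline");
    }
}
```

Polling with a bounded deadline rather than a fixed sleep keeps the end-to-end test fast when the job is quick and still gives it up to the full timeout under load.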







[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577139233



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoder.java
##########
@@ -0,0 +1,81 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import org.apache.avro.Schema;
+
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.Map;
+
+/**
+ * Schema coder that allows reading schema that is somehow embedded into serialized record. Used by
+ * {@link GlueSchemaRegistryAvroDeserializationSchema} and {@link
+ * GlueSchemaRegistryAvroSerializationSchema}.
+ */
+public class GlueSchemaRegistryAvroSchemaCoder implements SchemaCoder {
+    private GlueSchemaRegistryInputStreamDeserializer glueSchemaRegistryInputStreamDeserializer;
+    private GlueSchemaRegistryOutputStreamSerializer glueSchemaRegistryOutputStreamSerializer;
+
+    /**
+     * Constructor accepts transport name and configuration map for AWS Glue Schema Registry.
+     *
+     * @param transportName topic name or stream name etc.
+     * @param configs configurations for AWS Glue Schema Registry
+     */
+    public GlueSchemaRegistryAvroSchemaCoder(
+            final String transportName, final Map<String, Object> configs) {
+        glueSchemaRegistryInputStreamDeserializer =
+                new GlueSchemaRegistryInputStreamDeserializer(configs);
+        glueSchemaRegistryOutputStreamSerializer =
+                new GlueSchemaRegistryOutputStreamSerializer(transportName, configs);
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSchemaCoder(
+            final GlueSchemaRegistryInputStreamDeserializer
+                    glueSchemaRegistryInputStreamDeserializer) {
+        this.glueSchemaRegistryInputStreamDeserializer = glueSchemaRegistryInputStreamDeserializer;
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSchemaCoder(
+            final GlueSchemaRegistryOutputStreamSerializer
+                    glueSchemaRegistryOutputStreamSerializer) {
+        this.glueSchemaRegistryOutputStreamSerializer = glueSchemaRegistryOutputStreamSerializer;
+    }
+
+    @Override
+    public Schema readSchema(InputStream in) throws IOException {
+        return glueSchemaRegistryInputStreamDeserializer.getSchemaAndDeserializedStream(in);
+    }
+
+    @Override
+    public void writeSchema(Schema schema, OutputStream out) throws IOException {
+        byte[] data = ((ByteArrayOutputStream) out).toByteArray();

Review comment:
       It's always a `ByteArrayOutputStream`, since the stream comes from `GlueSchemaRegistrySerializationSchema` and is declared as a `ByteArrayOutputStream` in the super class `AvroSerializationSchema`.
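
If one wanted to avoid relying on that implicit contract, a defensive variant could verify the stream type before casting. This is a sketch, not the PR's code; the helper name `drainPayload` is hypothetical, and it simply replaces a potential `ClassCastException` with an explicit error message.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Sketch of a defensive alternative to the unchecked cast in writeSchema:
// check the runtime type and fail fast with a descriptive message when the
// OutputStream is not the expected ByteArrayOutputStream.
public class SchemaWriteSketch {
    static byte[] drainPayload(OutputStream out) {
        if (out instanceof ByteArrayOutputStream) {
            return ((ByteArrayOutputStream) out).toByteArray();
        }
        // An arbitrary OutputStream cannot be read back, so there is no
        // generic fallback; a clear error beats a ClassCastException.
        throw new IllegalArgumentException(
                "expected ByteArrayOutputStream, got " + out.getClass().getName());
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        baos.write(new byte[] {1, 2, 3});
        if (drainPayload(baos).length != 3) {
            throw new AssertionError("unexpected payload length");
        }
        System.out.println("payload drained");
    }
}
```

Since the super class guarantees the concrete type today, the unchecked cast works; the instanceof guard only buys a clearer failure if that invariant ever changes.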







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215",
       "triggerID" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296",
       "triggerID" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "triggerType" : "PUSH"
     }, {
       "hash" : "97864e82641279ac227eda19b938ebce62262867",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14316",
       "triggerID" : "97864e82641279ac227eda19b938ebce62262867",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5779d8d7942ed9d747a4f07043c3fad3d1ff82f0",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14331",
       "triggerID" : "5779d8d7942ed9d747a4f07043c3fad3d1ff82f0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "cb96570901cbca2a6c9fdefc98c4154839194fc1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14335",
       "triggerID" : "cb96570901cbca2a6c9fdefc98c4154839194fc1",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 97864e82641279ac227eda19b938ebce62262867 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14316) 
   * 5779d8d7942ed9d747a4f07043c3fad3d1ff82f0 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14331) 
   * cb96570901cbca2a6c9fdefc98c4154839194fc1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14335) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r578016466



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org/apache/flink/glue/schema/registry/test/GSRKinesisPubsubClient.java
##########
@@ -0,0 +1,182 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.GetRecordsResult;
+import org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.model.Record;
+import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
+import org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle;
+import org.apache.flink.streaming.connectors.kinesis.proxy.GetShardListResult;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy;
+import org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyInterface;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.auth.EnvironmentVariableCredentialsProvider;
+import com.amazonaws.client.builder.AwsClientBuilder;
+import com.amazonaws.services.kinesis.AmazonKinesis;
+import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;
+import com.amazonaws.services.kinesis.model.PutRecordRequest;
+import com.amazonaws.services.kinesis.model.PutRecordResult;
+import com.amazonaws.services.kinesis.model.ResourceNotFoundException;
+import com.amazonaws.services.schemaregistry.common.AWSDeserializerInput;
+import com.amazonaws.services.schemaregistry.common.AWSSerializerInput;
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.serializers.avro.AWSAvroSerializer;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
+import org.apache.avro.generic.GenericRecord;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.nio.ByteBuffer;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.UUID;
+
+/**
+ * Simple client to publish and retrieve messages, using the AWS Kinesis SDK, Flink Kinesis
+ * Connectors and Glue Schema Registry classes.
+ */
+public class GSRKinesisPubsubClient {
+    private static final Logger LOG = LoggerFactory.getLogger(GSRKinesisPubsubClient.class);
+
+    private final AmazonKinesis kinesisClient;
+    private final Properties properties;
+
+    public GSRKinesisPubsubClient(Properties properties) {
+        this.kinesisClient = createClientWithCredentials(properties);
+        this.properties = properties;
+    }
+
+    public void createStream(String stream, int shards, Properties props) throws Exception {
+        try {
+            kinesisClient.describeStream(stream);
+            kinesisClient.deleteStream(stream);
+        } catch (ResourceNotFoundException rnfe) {

Review comment:
       Fixed
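
       The delete-then-recreate pattern in `createStream` above treats `ResourceNotFoundException` as "nothing to delete": describe the stream, drop it if it exists, and swallow the not-found case before creating a fresh one. A self-contained sketch of that pattern (using a hypothetical stand-in client and exception class, not the real AWS SDK):

```java
import java.util.HashSet;
import java.util.Set;

public class RecreateStreamSketch {
    // Stand-in for the AWS SDK's ResourceNotFoundException.
    static class ResourceNotFoundException extends RuntimeException {}

    // Minimal fake Kinesis client: only tracks stream names.
    static class FakeKinesisClient {
        final Set<String> streams = new HashSet<>();

        void describeStream(String name) {
            if (!streams.contains(name)) {
                throw new ResourceNotFoundException();
            }
        }

        void deleteStream(String name) { streams.remove(name); }

        void createStream(String name, int shards) { streams.add(name); }
    }

    // The pattern from GSRKinesisPubsubClient#createStream: drop any stale
    // stream first; a missing stream is fine and is simply skipped.
    static void recreate(FakeKinesisClient client, String name, int shards) {
        try {
            client.describeStream(name);
            client.deleteStream(name);
        } catch (ResourceNotFoundException rnfe) {
            // Stream does not exist yet -- nothing to delete.
        }
        client.createStream(name, shards);
    }

    public static void main(String[] args) {
        FakeKinesisClient client = new FakeKinesisClient();
        recreate(client, "test-stream", 1); // fresh create: exception path
        recreate(client, "test-stream", 1); // delete-then-create path
        if (!client.streams.contains("test-stream")) {
            throw new AssertionError("stream missing after recreate");
        }
        System.out.println("ok");
    }
}
```

       This makes the helper idempotent across test runs: the first call takes the not-found branch, later calls take the delete branch, and both end with a fresh stream.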







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483









[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-794286699


   https://dev.azure.com/rmetzger/Flink/_build/results?buildId=8955&view=results





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944) 
   * f793772656ad942f463a23b0dd43f3522f147493 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574800498



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-clients_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       I mean testing how a customer would use this module.







[GitHub] [flink] rmetzger commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r587203685



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,259 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.9</httpclient.version>
+		<httpcore.version>4.4.11</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+		<reactivestreams.version>1.0.2</reactivestreams.version>
+		<lz4.version>1.6.0</lz4.version>
+		<netty.version>4.1.53.Final</netty.version>
+		<guava.version>29.0-jre</guava.version>
+	</properties>
+
+	<!-- ============================= -->
+	<!-- DEPENDENCY MANAGEMENT -->
+	<!-- ============================= -->
+	<dependencyManagement>
+		<dependencies>
+			<!-- dependencies to solve enforcer check issue -->
+			<!-- can be removed once new version of Glue Schema Registry releases -->
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>http-client-spi</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>aws-core</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>protocol-core</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>annotations</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>utils</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>apache-client</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>org.reactivestreams</groupId>
+				<artifactId>reactive-streams</artifactId>
+				<version>${reactivestreams.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>org.lz4</groupId>
+				<artifactId>lz4-java</artifactId>
+				<version>${lz4.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>aws-json-protocol</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>regions</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>sdk-core</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>io.netty</groupId>
+				<artifactId>netty-codec-http</artifactId>
+				<version>${netty.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>netty-nio-client</artifactId>
+				<version>${aws.sdkv2.version}</version>
+				<scope>runtime</scope>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>auth</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>metrics-spi</artifactId>
+				<version>${aws.sdkv2.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>io.netty</groupId>
+				<artifactId>netty-handler</artifactId>
+				<version>${netty.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>com.google.guava</groupId>
+				<artifactId>guava</artifactId>
+				<version>${guava.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>org.apache.httpcomponents</groupId>
+				<artifactId>httpclient</artifactId>
+				<version>${httpclient.version}</version>
+			</dependency>
+
+			<dependency>
+				<groupId>org.apache.httpcomponents</groupId>
+				<artifactId>httpcore</artifactId>
+				<version>${httpcore.version}</version>
+			</dependency>
+		</dependencies>
+	</dependencyManagement>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-kinesis-test_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro-glue-schema-registry</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>junit</groupId>
+			<artifactId>junit</artifactId>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+			<type>test-jar</type>
+		</dependency>
+
+		<dependency>
+			<groupId>software.amazon.awssdk</groupId>
+			<artifactId>sdk-core</artifactId>
+			<version>${aws.sdkv2.version}</version>
+		</dependency>
+		<dependency>
+			<groupId>com.amazonaws</groupId>
+			<artifactId>aws-java-sdk-kinesis</artifactId>
+			<version>${aws.sdk.version}</version>
+		</dependency>
+	</dependencies>
+
+	<build>
+		<plugins>
+			<!-- Use the shade plugin to build a fat jar for the Kinesis connector test -->
+			<plugin>
+				<groupId>org.apache.maven.plugins</groupId>
+				<artifactId>maven-shade-plugin</artifactId>

Review comment:
       Test modules don't need a NOTICE file, since they don't ship fat jars to Maven.







[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574088029



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<!-- core dependencies -->
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-core</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-clients_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>

Review comment:
       I remember it was required when I tested the package. If it should be provided by the user application, then I'll remove it.
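
       If the dependency does stay, the usual alternative to removing it is `provided` scope, so the user application (which already runs against a Flink distribution) supplies it at runtime instead of it being bundled. A sketch of what that declaration could look like (assuming the dependency is kept at all):

```xml
<!-- Hypothetical sketch: mark flink-clients as provided so the user
     application's Flink distribution supplies it at runtime. -->
<dependency>
	<groupId>org.apache.flink</groupId>
	<artifactId>flink-clients_${scala.binary.version}</artifactId>
	<version>${project.version}</version>
	<scope>provided</scope>
</dependency>
```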







[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574171820



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderTest.java
##########
@@ -0,0 +1,287 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import com.amazonaws.services.schemaregistry.caching.AWSSchemaRegistrySerializerCache;
+import com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryClient;
+import com.amazonaws.services.schemaregistry.common.configs.GlueSchemaRegistryConfiguration;
+import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
+import com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistrySerializationFacade;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import org.apache.avro.Schema;
+import org.hamcrest.Matchers;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.EnumSource;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
+import software.amazon.awssdk.services.glue.model.EntityNotFoundException;
+
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.File;
+import java.io.IOException;
+import java.io.InputStream;
+import java.lang.reflect.Field;
+import java.nio.ByteBuffer;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.UUID;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.CoreMatchers.notNullValue;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.ArgumentMatchers.anyMap;
+import static org.mockito.ArgumentMatchers.anyString;
+import static org.mockito.Mockito.doCallRealMethod;
+import static org.mockito.Mockito.spy;
+import static org.mockito.Mockito.when;

Review comment:
       We don't have a mock client or Docker image for the AWS Schema Registry to work with currently. What alternatives do we have here?







[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-804215539


   @jiamo this is currently not possible out of the box, but it can be achieved with some tweaks. Once you have Flink instantiating and using the Glue Data Catalog client, you would still need to implement property translation to sources/sinks, etc. (Kinesis/Kafka, etc.).
   
   What is your use-case? 
   
   





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * d6f06f07e0895117b345b99533ba7eda672ba765 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489) 
   * 0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 97864e82641279ac227eda19b938ebce62262867 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14316) 
   * 5779d8d7942ed9d747a4f07043c3fad3d1ff82f0 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14331) 
   * cb96570901cbca2a6c9fdefc98c4154839194fc1 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] rmetzger commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574733754



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderTest.java
##########
@@ -0,0 +1,287 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import com.amazonaws.services.schemaregistry.caching.AWSSchemaRegistrySerializerCache;
+import com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryClient;
+import com.amazonaws.services.schemaregistry.common.configs.GlueSchemaRegistryConfiguration;
+import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
+import com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistrySerializationFacade;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import org.apache.avro.Schema;
+import org.hamcrest.Matchers;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.EnumSource;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
+import software.amazon.awssdk.services.glue.model.EntityNotFoundException;
+
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.File;
+import java.io.IOException;
+import java.io.InputStream;
+import java.lang.reflect.Field;
+import java.nio.ByteBuffer;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.UUID;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.CoreMatchers.notNullValue;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.ArgumentMatchers.anyMap;
+import static org.mockito.ArgumentMatchers.anyString;
+import static org.mockito.Mockito.doCallRealMethod;
+import static org.mockito.Mockito.spy;
+import static org.mockito.Mockito.when;

Review comment:
       For example, getting rid of the mocking in `testReadSchema_withValidParams_succeeds()` is actually fairly easy:
   
   ```diff
   --- a/flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderTest.java
   +++ b/flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderTest.java
   @@ -21,6 +21,7 @@ package org.apache.flink.formats.avro.glue.schema.registry;
    import com.amazonaws.services.schemaregistry.caching.AWSSchemaRegistrySerializerCache;
    import com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryClient;
    import com.amazonaws.services.schemaregistry.common.configs.GlueSchemaRegistryConfiguration;
   +import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
    import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
    import com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistrySerializationFacade;
    import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
   @@ -64,7 +65,8 @@ public class GlueSchemaRegistryAvroSchemaCoderTest {
        @Mock private AWSSchemaRegistryClient mockClient;
        @Mock private AwsCredentialsProvider mockCred;
        @Mock private GlueSchemaRegistryConfiguration mockConfigs;
   -    @Mock private GlueSchemaRegistryInputStreamDeserializer mockInputStreamDeserializer;
   +    private GlueSchemaRegistryInputStreamDeserializer mockInputStreamDeserializer =
   +            new MockGlueSchemaRegistryInputStreamDeserializer();
   
        private static Schema userSchema;
        private static User userDefinedPojo;
   @@ -83,6 +85,18 @@ public class GlueSchemaRegistryAvroSchemaCoderTest {
                    8, 116, 101, 115, 116, 0, 20, 0, 12, 118, 105, 111, 108, 101, 116
                };
   
   +    private static class MockGlueSchemaRegistryInputStreamDeserializer
   +            extends GlueSchemaRegistryInputStreamDeserializer {
   +        public MockGlueSchemaRegistryInputStreamDeserializer() {
   +            super((AWSDeserializer) null);
   +        }
   +
   +        @Override
   +        public Schema getSchemaAndDeserializedStream(InputStream in) throws IOException {
   +            return userSchema;
   +        }
   +    }
   +
   ```







[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577864934



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>

Review comment:
       The dependency chain is fixed in the GSR package, but it'll need some time to release. Once it's out, it should also fix the enforcer check issue.







[GitHub] [flink] rmetzger commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r565363984



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org.apache.flink.glue.schema.registry.test/GSRKinesisPubsubClient.java
##########
@@ -0,0 +1,185 @@
+/*

Review comment:
       It seems that the package name is not properly encoded into subdirectories.
   Part of the directory name of this file is `org.apache.flink.glue.schema.registry.test`, but it should be `org/apache/flink/glue/schema/registry/test`. This can be difficult to spot in some IDEs, as they replace the directory structure with dot-notation.
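
   The fix is a single directory move so the on-disk path matches the `package` declaration. A stdlib-only sketch of that move, simulated in a scratch temp directory (in the actual repo you would use `git mv` instead, so history is preserved; the paths below mirror the ones in the review):

   ```java
   import java.io.IOException;
   import java.nio.file.Files;
   import java.nio.file.Path;

   public class FixPackageLayout {
       public static void main(String[] args) throws IOException {
           // Scratch tree standing in for src/main/java of the e2e-test module.
           Path root = Files.createTempDirectory("gsr-pkg-fix");
           // The wrong layout: one flat directory whose name contains dots.
           Path flat = root.resolve("org.apache.flink.glue.schema.registry.test");
           // The correct layout: nested directories matching the package.
           Path nested = root.resolve("org/apache/flink/glue/schema/registry/test");
           Files.createDirectories(flat);            // simulate the broken state
           Files.createDirectories(nested.getParent());
           Files.move(flat, nested);                 // the actual fix
           System.out.println(Files.isDirectory(nested)); // true
       }
   }
   ```
   
   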

##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org.apache.flink.glue.schema.registry.test/GlueSchemaRegistryExampleTest.java
##########
@@ -0,0 +1,146 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.glue.schema.registry.test;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.api.java.utils.ParameterTool;
+
+import org.apache.avro.Schema;
+import org.apache.avro.generic.GenericData;
+import org.apache.avro.generic.GenericRecord;
+import org.junit.Assert;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.IOException;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.concurrent.atomic.AtomicReference;
+
+/** Test driver for {@link GlueSchemaRegistryExample#main}. */
+public class GlueSchemaRegistryExampleTest {
+    private static final Logger LOG = LoggerFactory.getLogger(GlueSchemaRegistryExampleTest.class);
+
+    public static void main(String[] args) throws Exception {
+        LOG.info("System properties: {}", System.getProperties());
+        final ParameterTool parameterTool = ParameterTool.fromArgs(args);
+
+        String inputStream = parameterTool.getRequired("input-stream");
+        String outputStream = parameterTool.getRequired("output-stream");
+
+        GSRKinesisPubsubClient pubsub = new GSRKinesisPubsubClient(parameterTool.getProperties());
+        pubsub.createTopic(inputStream, 2, parameterTool.getProperties());
+        pubsub.createTopic(outputStream, 2, parameterTool.getProperties());
+
+        // The example job needs to start after streams are created and run in parallel to the
+        // validation logic.
+        // The thread that runs the job won't terminate, we don't have a job reference to cancel it.
+        // Once results are validated, the driver main thread will exit; job/cluster will be
+        // terminated from script.
+        final AtomicReference<Exception> executeException = new AtomicReference<>();
+        Thread executeThread =
+                new Thread(
+                        () -> {
+                            try {
+                                GlueSchemaRegistryExample.main(args);
+                                // this message won't appear in the log,
+                                // job is terminated when shutting down cluster
+                                LOG.info("executed program");
+                            } catch (Exception e) {
+                                executeException.set(e);
+                            }
+                        });
+        executeThread.start();
+
+        List<GenericRecord> messages = getRecords();
+        for (GenericRecord msg : messages) {
+            pubsub.sendMessage(GlueSchemaRegistryExample.getSchema().toString(), inputStream, msg);
+        }
+        LOG.info("generated records");
+
+        Deadline deadline = Deadline.fromNow(Duration.ofSeconds(60));
+        List<Object> results = pubsub.readAllMessages(outputStream);
+        while (deadline.hasTimeLeft()
+                && executeException.get() == null
+                && results.size() < messages.size()) {
+            LOG.info("waiting for results..");
+            Thread.sleep(1000);
+            results = pubsub.readAllMessages(outputStream);
+        }
+
+        if (executeException.get() != null) {
+            throw executeException.get();
+        }
+
+        LOG.info("results: {}", results);
+        Assert.assertEquals(
+                "Results received from '" + outputStream + "': " + results,
+                messages.size(),
+                results.size());
+
+        List<GenericRecord> expectedResults = getRecords();
+
+        for (Object expectedResult : expectedResults) {
+            Assert.assertTrue(results.contains(expectedResult));
+        }
+
+        // TODO: main thread needs to create job or CLI fails with:
+        // "The program didn't contain a Flink job. Perhaps you forgot to call execute() on the
+        // execution environment."
+        System.out.println("test finished");

Review comment:
       use `LOG`?

##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryInputStreamDeserializer.java
##########
@@ -0,0 +1,85 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.utils.MutableByteArrayInputStream;
+
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
+import org.apache.avro.Schema;
+import org.apache.avro.SchemaParseException;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.Map;
+
+/**
+ * AWS Glue Schema Registry input stream de-serializer to accept input stream and extract schema
+ * from it and remove schema registry information in the input stream.
+ */
+public class GlueSchemaRegistryInputStreamDeserializer {
+    private final AWSDeserializer awsDeserializer;
+
+    /**
+     * Constructor accepts configuration map for AWS Deserializer.
+     *
+     * @param configs configuration map
+     */
+    public GlueSchemaRegistryInputStreamDeserializer(Map<String, Object> configs) {
+        awsDeserializer =
+                AWSDeserializer.builder()
+                        .credentialProvider(DefaultCredentialsProvider.builder().build())
+                        .configs(configs)
+                        .build();
+    }
+
+    public GlueSchemaRegistryInputStreamDeserializer(AWSDeserializer awsDeserializer) {
+        this.awsDeserializer = awsDeserializer;
+    }
+
+    /**
+     * Get schema and remove extra Schema Registry information within input stream.
+     *
+     * @param in input stream
+     * @return schema of object within input stream
+     * @throws IOException Exception during decompression
+     */
+    public Schema getSchemaAndDeserializedStream(InputStream in) throws IOException {
+        byte[] inputBytes = new byte[in.available()];
+        in.read(inputBytes);
+        in.reset();
+
+        MutableByteArrayInputStream mutableByteArrayInputStream = (MutableByteArrayInputStream) in;
+        String schemaDefinition = awsDeserializer.getSchema(inputBytes).getSchemaDefinition();
+        byte[] deserializedBytes = awsDeserializer.getActualData(inputBytes);
+        mutableByteArrayInputStream.setBuffer(deserializedBytes);
+
+        Schema schema;
+        try {
+            schema = (new Schema.Parser()).parse(schemaDefinition);

Review comment:
       if I'm not mistaken, this parser initialization and schema parsing is done for every `RegistryAvroDeserializationSchema.deserialize()` call.
   I guess this is necessary when deserializing GenericRecord Avro records, but for SpecificRecord we only need to parse the schema once?
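
   The per-record parsing cost the comment describes can be avoided by memoizing parsed schemas, keyed on the schema-definition string. A minimal stdlib-only sketch of that caching pattern — `parseSchema` is a stand-in for `new Schema.Parser().parse(definition)`, since Avro classes aren't assumed here; only the cache logic is the point:

   ```java
   import java.util.Map;
   import java.util.concurrent.ConcurrentHashMap;
   import java.util.concurrent.atomic.AtomicInteger;

   public class SchemaCacheSketch {
       // One cache entry per distinct schema definition seen on the wire.
       static final Map<String, Object> CACHE = new ConcurrentHashMap<>();
       static final AtomicInteger PARSE_CALLS = new AtomicInteger();

       static Object parseSchema(String definition) {
           PARSE_CALLS.incrementAndGet(); // real code: new Schema.Parser().parse(definition)
           return new Object();
       }

       // deserialize() would call this instead of parsing unconditionally.
       static Object getOrParse(String definition) {
           return CACHE.computeIfAbsent(definition, SchemaCacheSketch::parseSchema);
       }

       public static void main(String[] args) {
           Object a = getOrParse("{\"type\":\"string\"}");
           Object b = getOrParse("{\"type\":\"string\"}"); // cache hit, no reparse
           System.out.println(a == b);            // true
           System.out.println(PARSE_CALLS.get()); // 1
       }
   }
   ```

   For SpecificRecord the definition is fixed, so this degenerates to parsing exactly once; for GenericRecord it still bounds the work to one parse per distinct writer schema.
   
   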

##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderTest.java
##########
@@ -0,0 +1,287 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import com.amazonaws.services.schemaregistry.caching.AWSSchemaRegistrySerializerCache;
+import com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryClient;
+import com.amazonaws.services.schemaregistry.common.configs.GlueSchemaRegistryConfiguration;
+import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
+import com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistrySerializationFacade;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import org.apache.avro.Schema;
+import org.hamcrest.Matchers;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.EnumSource;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
+import software.amazon.awssdk.services.glue.model.EntityNotFoundException;
+
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.File;
+import java.io.IOException;
+import java.io.InputStream;
+import java.lang.reflect.Field;
+import java.nio.ByteBuffer;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.UUID;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.CoreMatchers.notNullValue;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.ArgumentMatchers.anyMap;
+import static org.mockito.ArgumentMatchers.anyString;
+import static org.mockito.Mockito.doCallRealMethod;
+import static org.mockito.Mockito.spy;
+import static org.mockito.Mockito.when;

Review comment:
       Per the "Apache Flink Code Style and Quality Guide" (https://flink.apache.org/contributing/code-style-and-quality-common.html) the use of mockito is not recommended in tests. Check the document, and also the slides linked there for more details.
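To illustrate the style the guide recommends, a hand-written test double can replace a Mockito mock: it implements the collaborator's interface directly, records interactions, and returns canned data. The interface below is hypothetical (standing in for something like a schema-registry client), not taken from the PR.

```java
/** Hypothetical collaborator interface, standing in for e.g. a schema-registry client. */
interface SchemaLookup {
    String getSchemaDefinition(byte[] payload);
}

/** A hand-written fake: records interactions and returns canned data, no Mockito needed. */
class FakeSchemaLookup implements SchemaLookup {
    int calls = 0;
    final String cannedDefinition;

    FakeSchemaLookup(String cannedDefinition) {
        this.cannedDefinition = cannedDefinition;
    }

    @Override
    public String getSchemaDefinition(byte[] payload) {
        calls++; // interaction recorded explicitly, instead of Mockito's verify()
        return cannedDefinition;
    }
}
```

Unlike a mock, the fake is ordinary code: it survives refactorings of call sites and fails to compile (rather than at runtime) when the interface changes.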

##########
File path: flink-end-to-end-tests/test-scripts/test_glue_schema_registry.sh
##########
@@ -0,0 +1,72 @@
+#!/usr/bin/env bash

Review comment:
       I'm feeling a bit sorry that I didn't tell you beforehand, but the Flink community decided recently to stop adding new bash-based e2e tests: https://lists.apache.org/thread.html/rdc2894c67c6da3fa92f85ec2cde1d5a0c551748050431d36c13bf7a3%40%3Cdev.flink.apache.org%3E
   We have a Java e2e test framework that is going to replace the bash tests in the long run. See `SQLClientSchemaRegistryITCase` or some other examples there.

##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>

Review comment:
       why do we have to skip the maven enforcer rules?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-799565686


   CI is passing with and without AWS credentials. I will merge this now.





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r578256434



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderProvider.java
##########
@@ -0,0 +1,53 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import javax.annotation.Nonnull;
+
+import java.util.Map;
+
+/** Provider for {@link GlueSchemaRegistryAvroSchemaCoder}. */
+public class GlueSchemaRegistryAvroSchemaCoderProvider implements SchemaCoder.SchemaCoderProvider {
+    private final String transportName;
+    private final Map<String, Object> configs;
+
+    /**
+     * Constructor used by {@link GlueSchemaRegistryAvroDeserializationSchema} and {@link
+     * GlueSchemaRegistryAvroSerializationSchema}.
+     *
+     * @param transportName topic name or stream name etc.
+     * @param configs configurations for AWS Glue Schema Registry
+     */
+    public GlueSchemaRegistryAvroSchemaCoderProvider(
+            String transportName, @Nonnull Map<String, Object> configs) {

Review comment:
       As previously discussed, we should not usually annotate `@Nonnull`. Instead we should annotate `@Nullable`
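A minimal sketch of the convention being suggested: parameters are non-null by default, only nullable ones carry an annotation, and each case is backed by either a fallback or an eager check. The class and names below are illustrative, not the PR's API.

```java
/** Illustrative provider following the non-null-by-default convention. */
class CoderProvider {
    private final String transportName; // nullable input: a default is substituted
    private final java.util.Map<String, Object> configs; // non-null by convention

    // In Flink code the first parameter would carry @javax.annotation.Nullable;
    // the second stays unannotated and is checked eagerly.
    CoderProvider(/* @Nullable */ String transportName, java.util.Map<String, Object> configs) {
        this.transportName = transportName != null ? transportName : "default-stream";
        this.configs = java.util.Objects.requireNonNull(configs, "configs must not be null");
    }

    String transportName() {
        return transportName;
    }
}
```

The eager `requireNonNull` turns a latent NPE at first use into a clear failure at construction time.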







[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-792069594


   > https://dev.azure.com/rmetzger/Flink/_build/results?buildId=8944&view=results
   
   Hi Robert, the CI failed because it ran my e2e test. Is it because I added the secrets forwarding work?
   
   ```
   2021-03-06T03:20:28.9621676Z org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Exception occurred while fetching or registering schema definition = {"type":"record","name":"User","namespace":"org.apache.flink.glue.schema.registry.test","fields":[{"name":"name","type":"string"},{"name":"favorite_number","type":["int","null"]},{"name":"favorite_color","type":["string","null"]}]}, schema name = gsr-input-stream 
   2021-03-06T03:20:28.9625458Z 	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:372)
   2021-03-06T03:20:28.9629120Z 	at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
   2021-03-06T03:20:28.9631833Z 	at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
   2021-03-06T03:20:28.9634712Z 	at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:812)
   2021-03-06T03:20:28.9637432Z 	at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246)
   2021-03-06T03:20:28.9640102Z 	at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054)
   2021-03-06T03:20:28.9642641Z 	at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
   2021-03-06T03:20:28.9645230Z 	at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28)
   2021-03-06T03:20:28.9647741Z 	at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
   2021-03-06T03:20:28.9651870Z Caused by: com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException: Exception occurred while fetching or registering schema definition = {"type":"record","name":"User","namespace":"org.apache.flink.glue.schema.registry.test","fields":[{"name":"name","type":"string"},{"name":"favorite_number","type":["int","null"]},{"name":"favorite_color","type":["string","null"]}]}, schema name = gsr-input-stream 
   2021-03-06T03:20:28.9655740Z 	at com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryClient.getORRegisterSchemaVersionId(AWSSchemaRegistryClient.java:190)
   2021-03-06T03:20:28.9658830Z 	at com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistrySerializationFacade.getOrRegisterSchemaVersion(GlueSchemaRegistrySerializationFacade.java:84)
   2021-03-06T03:20:28.9662027Z 	at com.amazonaws.services.schemaregistry.serializers.avro.AWSAvroSerializer.registerSchema(AWSAvroSerializer.java:59)
   2021-03-06T03:20:28.9665706Z 	at org.apache.flink.glue.schema.registry.test.GSRKinesisPubsubClient.sendMessage(GSRKinesisPubsubClient.java:103)
   2021-03-06T03:20:28.9669840Z 	at org.apache.flink.glue.schema.registry.test.GlueSchemaRegistryExampleTest.main(GlueSchemaRegistryExampleTest.java:73)
   2021-03-06T03:20:28.9672485Z 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   2021-03-06T03:20:28.9693855Z 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   2021-03-06T03:20:28.9694668Z 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   2021-03-06T03:20:28.9695132Z 	at java.lang.reflect.Method.invoke(Method.java:498)
   2021-03-06T03:20:28.9695580Z 	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
   2021-03-06T03:20:28.9695953Z 	... 8 more
   2021-03-06T03:20:28.9696928Z Caused by: com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException: Failed to get schemaVersionId by schema definition for schema name = gsr-input-stream 
   2021-03-06T03:20:28.9697831Z 	at com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryClient.getSchemaVersionIdByDefinition(AWSSchemaRegistryClient.java:136)
   2021-03-06T03:20:28.9698500Z 	at com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryClient.getORRegisterSchemaVersionId(AWSSchemaRegistryClient.java:167)
   2021-03-06T03:20:28.9698930Z 	... 17 more
   2021-03-06T03:20:28.9702818Z Caused by: software.amazon.awssdk.core.exception.SdkClientException: Unable to load credentials from any of the providers in the chain AwsCredentialsProviderChain(credentialsProviders=[SystemPropertyCredentialsProvider(), EnvironmentVariableCredentialsProvider(), WebIdentityTokenCredentialsProvider(), ProfileCredentialsProvider(), ContainerCredentialsProvider(), InstanceProfileCredentialsProvider()]) : [SystemPropertyCredentialsProvider(): Unable to load credentials from system settings. Access key must be specified either via environment variable (AWS_ACCESS_KEY_ID) or system property (aws.accessKeyId)., EnvironmentVariableCredentialsProvider(): Unable to load credentials from system settings. Secret key must be specified either via environment variable (AWS_SECRET_ACCESS_KEY) or system property (aws.secretAccessKey)., WebIdentityTokenCredentialsProvider(): Either the environment variable AWS_WEB_IDENTITY_TOKEN_FILE or the javaproperty aws.webIdentity
 TokenFile must be set., ProfileCredentialsProvider(): Profile file contained no credentials for profile 'default': ProfileFile(profiles=[]), ContainerCredentialsProvider(): Cannot fetch credentials from container - neither AWS_CONTAINER_CREDENTIALS_FULL_URI or AWS_CONTAINER_CREDENTIALS_RELATIVE_URI environment variables are set., InstanceProfileCredentialsProvider(): The requested metadata is not found at http://169.254.169.254/latest/meta-data/iam/security-credentials/]
   2021-03-06T03:20:28.9705966Z 	at software.amazon.awssdk.core.exception.SdkClientException$BuilderImpl.build(SdkClientException.java:98)
   2021-03-06T03:20:28.9706898Z 	at software.amazon.awssdk.auth.credentials.AwsCredentialsProviderChain.resolveCredentials(AwsCredentialsProviderChain.java:112)
   2021-03-06T03:20:28.9707712Z 	at software.amazon.awssdk.auth.credentials.internal.LazyAwsCredentialsProvider.resolveCredentials(LazyAwsCredentialsProvider.java:45)
   2021-03-06T03:20:28.9708523Z 	at software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider.resolveCredentials(DefaultCredentialsProvider.java:104)
   2021-03-06T03:20:28.9709131Z 	at software.amazon.awssdk.awscore.client.handler.AwsClientHandlerUtils.createExecutionContext(AwsClientHandlerUtils.java:76)
   2021-03-06T03:20:28.9709738Z 	at software.amazon.awssdk.awscore.client.handler.AwsSyncClientHandler.createExecutionContext(AwsSyncClientHandler.java:68)
   2021-03-06T03:20:28.9710318Z 	at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.lambda$execute$1(BaseSyncClientHandler.java:97)
   2021-03-06T03:20:28.9710908Z 	at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.measureApiCallSuccess(BaseSyncClientHandler.java:167)
   2021-03-06T03:20:28.9711482Z 	at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.execute(BaseSyncClientHandler.java:94)
   2021-03-06T03:20:28.9712252Z 	at software.amazon.awssdk.core.client.handler.SdkSyncClientHandler.execute(SdkSyncClientHandler.java:45)
   2021-03-06T03:20:28.9712784Z 	at software.amazon.awssdk.awscore.client.handler.AwsSyncClientHandler.execute(AwsSyncClientHandler.java:55)
   2021-03-06T03:20:28.9713379Z 	at software.amazon.awssdk.services.glue.DefaultGlueClient.getSchemaByDefinition(DefaultGlueClient.java:6710)
   2021-03-06T03:20:28.9714087Z 	at com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryClient.getSchemaVersionIdByDefinition(AWSSchemaRegistryClient.java:132)
   2021-03-06T03:20:28.9714731Z 	... 18 more
   2021-03-06T03:20:29.2126395Z Mar 06 03:20:29 Stopping taskexecutor daemon (pid: 12562) on host fv-az159-607.
   2021-03-06T03:20:30.3662777Z Mar 06 03:20:30 Stopping standalonesession daemon (pid: 12299) on host fv-az159-607.
   2021-03-06T03:20:31.4981598Z Mar 06 03:20:31 terminating kinesalite
   2021-03-06T03:20:31.7457600Z Mar 06 03:20:31 flink-glue-schema-registry-test
   2021-03-06T03:20:31.7498206Z Mar 06 03:20:31 [FAIL] Test script contains errors.
   ```
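The root cause in the trace above is the AWS default credential chain exhausting every provider without finding credentials. The chain's try-each-provider behaviour can be sketched generically; this plain-Java illustration is not the AWS SDK API, just the pattern it implements.

```java
import java.util.List;
import java.util.Optional;
import java.util.function.Supplier;

/** Generic sketch of a credentials-provider chain: the first provider that yields wins. */
class ProviderChain {
    static Optional<String> resolve(List<Supplier<Optional<String>>> providers) {
        for (Supplier<Optional<String>> provider : providers) {
            Optional<String> creds = provider.get();
            if (creds.isPresent()) {
                return creds; // stop at the first successful provider
            }
        }
        return Optional.empty(); // all providers failed: the SdkClientException case above
    }
}
```

In the CI failure, every link of the chain (system properties, environment variables, web identity, profile, container, instance metadata) returned empty, so forwarding credentials to the build environment is what makes the e2e test runnable.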





[GitHub] [flink] mohitpali commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
mohitpali commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-767211338


   Apologies for the confusion, we have closed the other PR. We had to create another PR because two developers were working on it, hence the different login. I have included some CI compilation fixes in this PR and rebased.





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 5779d8d7942ed9d747a4f07043c3fad3d1ff82f0 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14331) 
   * cb96570901cbca2a6c9fdefc98c4154839194fc1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14335) 
   * bafd3a9a41461850057e9f2d50c3cea1c52aad7a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14340) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 061978aeb474636d954f85b0408b00a21f2571e5 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794) 
   * 46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577862543



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderProvider.java
##########
@@ -0,0 +1,53 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import lombok.NonNull;

Review comment:
       `transportName` can be null because we've set a default value for it.







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148) Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112) 
   * c77e3d4a1a7484c4f1e24b23a1099364b834cf75 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-794477315


   It failed again: https://dev.azure.com/rmetzger/Flink/_build/results?buildId=8955&view=logs&j=9401bf33-03c4-5a24-83fe-e51d75db73ef&t=72901ab2-7cd0-57be-82b1-bca51de20fba





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-792643073


   > It looks like the GSR test is running even when the creds are not set. Can we try setting:
   > 
   > ```
   > if [ -n "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" ] && [ -n "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]; then
   >   run_test "AWS Glue Schema Registry nightly end-to-end test" "$END_TO_END_DIR/test-scripts/test_glue_schema_registry.sh"
   > fi
   > ```
   > 
   > to
   > 
   > ```
   > if [ "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" ] && [ "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]; then
   >   run_test "AWS Glue Schema Registry nightly end-to-end test" "$END_TO_END_DIR/test-scripts/test_glue_schema_registry.sh"
   > fi
   > ```
   
   Updated
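
   For what it's worth, the two forms quoted above are equivalent as long as the variables are quoted: `[ -n "$X" ]` and `[ "$X" ]` are both true exactly when `$X` is non-empty. A minimal sketch (the `guard` function is hypothetical, not part of the PR):

   ```shell
   # Hypothetical helper comparing the two test styles quoted above.
   # With quoted expansion, `[ -n "$1" ]` and `[ "$1" ]` always agree.
   guard() {
       if [ -n "$1" ] && [ "$1" ]; then
           echo "both-true"        # non-empty: both operators succeed
       elif [ -z "$1" ] && ! [ "$1" ]; then
           echo "both-false"       # empty: both operators fail
       else
           echo "diverge"          # never reached for quoted expansion
       fi
   }

   guard "some-access-key"   # -> both-true
   guard ""                  # -> both-false
   ```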





[GitHub] [flink] LinyuYao1021 removed a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 removed a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-794238904


   > password
   
   Not yet, I haven't used Azure before. Could you quickly walk me through what to do to set up CI?





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r587263383



##########
File path: flink-formats/flink-avro-glue-schema-registry/pom.xml
##########
@@ -0,0 +1,99 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-formats</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-avro-glue-schema-registry</artifactId>
+	<name>Flink : Formats : Avro AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<glue.schema.registry.version>1.0.0</glue.schema.registry.version>
+		<junit.jupiter.version>5.6.2</junit.jupiter.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>

Review comment:
       We will fix this in a follow-up.







[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r583566598



##########
File path: flink-end-to-end-tests/test-scripts/test_glue_schema_registry.sh
##########
@@ -0,0 +1,78 @@
+#!/usr/bin/env bash
+################################################################################
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+################################################################################
+# To run this test locally, AWS credential is required.

Review comment:
       I haven't encountered this error before, but I think it's related to the Flink-Kinesis connector.







[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766724268


   What's the relationship of this PR to https://github.com/apache/flink/pull/14490 ?





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r576978089



##########
File path: flink-end-to-end-tests/test-scripts/test_glue_schema_registry.sh
##########
@@ -0,0 +1,72 @@
+#!/usr/bin/env bash

Review comment:
       I have discussed this with @LinyuYao1021 and we agreed to accept the `sh` test and fast-follow with a migration to the Java solution. We will raise a follow-up Jira and I will perform the migration. Since the migration will mostly consist of adding support for Kinesalite, I will then look at the Kinesis e2e tests, since they should be similar.







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215",
       "triggerID" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296",
       "triggerID" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "triggerType" : "PUSH"
     }, {
       "hash" : "97864e82641279ac227eda19b938ebce62262867",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14316",
       "triggerID" : "97864e82641279ac227eda19b938ebce62262867",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5779d8d7942ed9d747a4f07043c3fad3d1ff82f0",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14331",
       "triggerID" : "5779d8d7942ed9d747a4f07043c3fad3d1ff82f0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "cb96570901cbca2a6c9fdefc98c4154839194fc1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14335",
       "triggerID" : "cb96570901cbca2a6c9fdefc98c4154839194fc1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bafd3a9a41461850057e9f2d50c3cea1c52aad7a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "bafd3a9a41461850057e9f2d50c3cea1c52aad7a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 5779d8d7942ed9d747a4f07043c3fad3d1ff82f0 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14331) 
   * cb96570901cbca2a6c9fdefc98c4154839194fc1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14335) 
   * bafd3a9a41461850057e9f2d50c3cea1c52aad7a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 399f06e14079a512d35814508b6f7598d7d175ba Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * c77e3d4a1a7484c4f1e24b23a1099364b834cf75 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153) 
   * a87a26aa218658f7098367cae8c7a2ed18430296 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-801122674


   Very nice! Thanks a lot for your efforts!





[GitHub] [flink] LinyuYao1021 removed a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 removed a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793563537


   > > It looks like the GSR test is running even when the creds are not set. Can we try setting:
   > > ```
   > > if [ -n "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" ] && [ -n "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]; then
   > >   run_test "AWS Glue Schema Registry nightly end-to-end test" "$END_TO_END_DIR/test-scripts/test_glue_schema_registry.sh"
   > > fi
   > > ```
   > > 
   > > 
   > > to
   > > ```
   > > if [ "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" ] && [ "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]; then
   > >   run_test "AWS Glue Schema Registry nightly end-to-end test" "$END_TO_END_DIR/test-scripts/test_glue_schema_registry.sh"
   > > fi
   > > ```
   > 
   > Updated
   
   After diving into this, I find that `-n` tests whether a string is non-empty, i.e. whether its length is non-zero, so we can use it for the condition check. The problem is that we should use `[[ ]]` instead of `[ ]` to ensure 
   
   > The `-z` approach should work: https://github.com/apache/flink/blob/master/flink-end-to-end-tests/test-scripts/common_s3.sh#L25
   > but it's worth a try.
   > 
   > Can you also add
   > 
   > ```
   > SECRET_S3_ACCESS_KEY: $[variables.IT_CASE_S3_ACCESS_KEY]
   > SECRET_S3_SECRET_KEY: $[variables.IT_CASE_S3_SECRET_KEY]
   > ```
   > 
   > to `build-apache-repo.yml`?
   
   Yes, I've added it in the latest commit.
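
   The `-n` guard discussed above can be sketched as follows. This is a minimal illustration, not code from the PR itself; the helper function name is made up, and only the variable names mirror the ones in the discussion:

   ```shell
   #!/usr/bin/env bash
   # Sketch of the credential guard for the nightly e2e test.
   # [[ ]] is a bash builtin: unquoted variables inside it are not
   # word-split, so an unset variable cannot break the test's syntax.
   # With plain [ ], the variables must be quoted to stay safe.

   should_run_gsr_test() {
       # Succeeds (exit 0) only when both credential variables are non-empty.
       [[ -n "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" && -n "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]]
   }

   if should_run_gsr_test; then
       echo "running GSR e2e test"
   else
       echo "skipping GSR e2e test (credentials not set)"
   fi
   ```

   With both variables unset or empty this prints the skip message, which matches the intent of the guard in the quoted snippet.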





[GitHub] [flink] rmetzger edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793576430


   > Why is the s3 access key needed in my case?
   
   
   Sorry, I meant 
   ```
     SECRET_GLUE_SCHEMA_ACCESS_KEY: $[variables.IT_CASE_GLUE_SCHEMA_ACCESS_KEY]
     SECRET_GLUE_SCHEMA_SECRET_KEY: $[variables.IT_CASE_GLUE_SCHEMA_SECRET_KEY]
   ```





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577987171



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSchemaCoderProvider.java
##########
@@ -0,0 +1,53 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import lombok.NonNull;

Review comment:
       Ah ok, Flink convention is that `@Nonnull` is assumed default and `@Nullable` fields should be annotated. Please change accordingly








[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-791255225


   > Both CIs are green, however, the e2e test didn't execute on my CI. I'll quickly try to fix that.
   
   Thanks Robert. Please give me feedback once it's working.





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-791348220


   Looks like the test failed: https://dev.azure.com/rmetzger/Flink/_build/results?buildId=8940&view=logs&j=9401bf33-03c4-5a24-83fe-e51d75db73ef&t=72901ab2-7cd0-57be-82b1-bca51de20fba





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577868644



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,252 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.3</httpclient.version>
+		<httpcore.version>4.4.6</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+	</properties>
+
+	<!-- ============================= -->
+	<!-- DEPENDENCY MANAGEMENT -->
+	<!-- ============================= -->
+	<dependencyManagement>
+		<dependencies>
+			<dependency>
+				<groupId>software.amazon.awssdk</groupId>
+				<artifactId>http-client-spi</artifactId>
+				<version>2.15.32</version>

Review comment:
       Will update the version. The unit test failed due to changes lost during rebasing; will fix in the next commit.







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 061978aeb474636d954f85b0408b00a21f2571e5 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794) 
   * 46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] LinyuYao1021 commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-795179693


   This is my pipeline, running my commit directly to see what the problem is:
   
   - https://dev.azure.com/yaolinyu3547/yaoliny.flink/_build/results?buildId=1&view=results





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395) 
   * 86353afe862e41b51dc48caf19f48fc03e6246b0 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397) 
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 81adc9cccd9fc0247e58ad7688252e1d98382cc5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793006791


   There are no secrets set up for the main CI (which runs the pull request validation). We cannot do this, because people could steal our credentials by opening a pull request that exports the secrets from the env variables.
   My Azure account has the secrets set up.
   I would recommend you do the same on your personal Azure account (it's free).





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r574345802



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryAvroSerializationSchema.java
##########
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.annotation.VisibleForTesting;
+import org.apache.flink.formats.avro.RegistryAvroSerializationSchema;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import lombok.SneakyThrows;
+import org.apache.avro.Schema;
+import org.apache.avro.generic.GenericRecord;
+import org.apache.avro.io.Encoder;
+import org.apache.avro.specific.SpecificRecord;
+
+import javax.annotation.Nullable;
+
+import java.io.ByteArrayOutputStream;
+import java.util.Map;
+
+/**
+ * AWS Glue Schema Registry Serialization schema to serialize to Avro binary format for Flink
+ * Producer user.
+ *
+ * @param <T> the type to be serialized
+ */
+public class GlueSchemaRegistryAvroSerializationSchema<T>
+        extends RegistryAvroSerializationSchema<T> {
+    /**
+     * Creates an Avro serialization schema.
+     *
+     * @param recordClazz class to serialize. Should be one of: {@link SpecificRecord}, {@link
+     *     GenericRecord}.
+     * @param reader reader's Avro schema. Should be provided if recordClazz is {@link
+     *     GenericRecord}
+     * @param schemaCoderProvider schema coder provider which reads writer schema from AWS Glue
+     *     Schema Registry
+     */
+    private GlueSchemaRegistryAvroSerializationSchema(
+            Class<T> recordClazz,
+            @Nullable Schema reader,
+            SchemaCoder.SchemaCoderProvider schemaCoderProvider) {
+        super(recordClazz, reader, schemaCoderProvider);
+    }
+
+    @VisibleForTesting
+    protected GlueSchemaRegistryAvroSerializationSchema(
+            Class<T> recordClazz, @Nullable Schema reader, SchemaCoder schemaCoder) {
+        // Pass null schema coder provider
+        super(recordClazz, reader, null);
+        this.schemaCoder = schemaCoder;
+    }
+
+    /**
+     * Creates {@link GlueSchemaRegistryAvroSerializationSchema} that serializes {@link
+     * GenericRecord} using provided schema.
+     *
+     * @param schema the schema that will be used for serialization
+     * @param transportName topic name or stream name etc.
+     * @param configs configuration map of AWS Glue Schema Registry
+     * @return serialized record in form of byte array
+     */
+    public static GlueSchemaRegistryAvroSerializationSchema<GenericRecord> forGeneric(
+            Schema schema, String transportName, Map<String, Object> configs) {
+        return new GlueSchemaRegistryAvroSerializationSchema<>(
+                GenericRecord.class,
+                schema,
+                new GlueSchemaRegistryAvroSchemaCoderProvider(transportName, configs));
+    }
+
+    /**
+     * Creates {@link GlueSchemaRegistryAvroSerializationSchema} that serializes {@link
+     * SpecificRecord} using provided schema.
+     *
+     * @param clazz the type to be serialized
+     * @param transportName topic name or stream name etc.
+     * @param configs configuration map of Amazon Schema Registry
+     * @return serialized record in form of byte array
+     */
+    public static <T extends SpecificRecord>
+            GlueSchemaRegistryAvroSerializationSchema<T> forSpecific(
+                    Class<T> clazz, String transportName, Map<String, Object> configs) {
+        return new GlueSchemaRegistryAvroSerializationSchema<>(
+                clazz, null, new GlueSchemaRegistryAvroSchemaCoderProvider(transportName, configs));
+    }
+
+    /**
+     * Serializes the incoming element to a byte array containing bytes of AWS Glue Schema registry
+     * information.
+     *
+     * @param object The incoming element to be serialized
+     * @return The serialized bytes.
+     */
+    @SneakyThrows
+    @Override
+    public byte[] serialize(T object) {
+        checkAvroInitialized();
+
+        if (object == null) {
+            return null;
+        } else {
+            ByteArrayOutputStream outputStream = getOutputStream();
+            outputStream.reset();
+            Encoder encoder = getEncoder();
+            getDatumWriter().write(object, encoder);
+            schemaCoder.writeSchema(getSchema(), outputStream);
+            encoder.flush();
+
+            return outputStream.toByteArray();
+        }
+    }

Review comment:
       Ah, ok 👍 







[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577482913



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/pom.xml
##########
@@ -0,0 +1,150 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<parent>
+		<artifactId>flink-end-to-end-tests</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.13-SNAPSHOT</version>
+		<relativePath>..</relativePath>
+	</parent>
+	<modelVersion>4.0.0</modelVersion>
+
+	<artifactId>flink-glue-schema-registry-test_${scala.binary.version}</artifactId>
+	<name>Flink : E2E Tests : AWS Glue Schema Registry</name>
+	<packaging>jar</packaging>
+
+	<properties>
+		<httpclient.version>4.5.9</httpclient.version>
+		<httpcore.version>4.4.11</httpcore.version>
+		<aws.sdk.version>1.11.754</aws.sdk.version>
+		<aws.sdkv2.version>2.15.32</aws.sdkv2.version>
+		<enforcer.skip>true</enforcer.skip>
+	</properties>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-kinesis-test_${scala.binary.version}</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-avro-glue-schema-registry</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+
+		<dependency>
+			<groupId>junit</groupId>
+			<artifactId>junit</artifactId>
+			<version>${junit.version}</version>
+			<scope>compile</scope>

Review comment:
       Fixed




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766279661


   Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress of the review.
   
   
   ## Automated Checks
   Last check on commit 399f06e14079a512d35814508b6f7598d7d175ba (Sun Jan 24 02:36:14 UTC 2021)
   
   **Warnings:**
    * **4 pom.xml files were touched**: Check for build and licensing issues.
    * No documentation files were touched! Remember to keep the Flink docs up to date!
   
   
   <sub>Mention the bot in a comment to re-run the automated checks.</sub>
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
   * ❓ 2. There is [consensus] that the contribution should go into Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process.<details>
     The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`)
    - `@flinkbot approve all` to approve all aspects
    - `@flinkbot approve-until architecture` to approve everything until `architecture`
    - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention
    - `@flinkbot disapprove architecture` to remove an approval you gave earlier
   </details>





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-768331002


   Thanks a lot for the clarification





[GitHub] [flink] dannycranmer commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r583652106



##########
File path: flink-end-to-end-tests/test-scripts/test_glue_schema_registry.sh
##########
@@ -0,0 +1,78 @@
+#!/usr/bin/env bash
+################################################################################
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+################################################################################
+# To run this test locally, AWS credentials are required.

Review comment:
       Sorry, that was the wrong exception; this is the cause:
   
   ```
   com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException: Error occurred while parsing schema, see inner exception for details.
   	at org.apache.flink.formats.avro.glue.schema.registry.GlueSchemaRegistryInputStreamDeserializer.getSchemaAndDeserializedStream(GlueSchemaRegistryInputStreamDeserializer.java:82) ~[?:?]
   	at org.apache.flink.formats.avro.glue.schema.registry.GlueSchemaRegistryAvroSchemaCoder.readSchema(GlueSchemaRegistryAvroSchemaCoder.java:72) ~[?:?]
   	at org.apache.flink.formats.avro.RegistryAvroDeserializationSchema.deserialize(RegistryAvroDeserializationSchema.java:73) ~[?:?]
   	at org.apache.flink.streaming.connectors.kinesis.serialization.KinesisDeserializationSchemaWrapper.deserialize(KinesisDeserializationSchemaWrapper.java:71) ~[?:?]
   	at org.apache.flink.streaming.connectors.kinesis.internals.ShardConsumer.deserializeRecordForCollectionAndUpdateState(ShardConsumer.java:192) ~[?:?]
   	at org.apache.flink.streaming.connectors.kinesis.internals.ShardConsumer.lambda$run$0(ShardConsumer.java:125) ~[?:?]
   	at org.apache.flink.streaming.connectors.kinesis.internals.ShardConsumer$$Lambda$406/0x0000000000000000.accept(Unknown Source) ~[?:?]
   	at org.apache.flink.streaming.connectors.kinesis.internals.publisher.polling.PollingRecordPublisher.run(PollingRecordPublisher.java:117) ~[?:?]
   	at org.apache.flink.streaming.connectors.kinesis.internals.publisher.polling.PollingRecordPublisher.run(PollingRecordPublisher.java:101) ~[?:?]
   	at org.apache.flink.streaming.connectors.kinesis.internals.ShardConsumer.run(ShardConsumer.java:113) ~[?:?]
   	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?]
   	at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?]
   	at java.lang.Thread.run(Thread.java:836) ~[?:?]
   Caused by: org.apache.avro.SchemaParseException: Can't redefine: org.apache.flink.glue.schema.registry.test.User
   	at org.apache.avro.Schema$Names.put(Schema.java:1542) ~[?:?]
   	at org.apache.avro.Schema$Names.add(Schema.java:1536) ~[?:?]
   	at org.apache.avro.Schema.parse(Schema.java:1655) ~[?:?]
   	at org.apache.avro.Schema$Parser.parse(Schema.java:1425) ~[?:?]
   	at org.apache.avro.Schema$Parser.parse(Schema.java:1413) ~[?:?]
   	at org.apache.flink.formats.avro.glue.schema.registry.GlueSchemaRegistryInputStreamDeserializer.getSchemaAndDeserializedStream(GlueSchemaRegistryInputStreamDeserializer.java:78) ~[?:?]
   	... 14 more
   ```







[GitHub] [flink] dannycranmer commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
dannycranmer commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-792609755


   It looks like the GSR test is running even when the creds are not set. Can we try changing:
   
   ```
   if [ -n "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" ] && [ -n "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]; then
     run_test "AWS Glue Schema Registry nightly end-to-end test" "$END_TO_END_DIR/test-scripts/test_glue_schema_registry.sh"
   fi
   ```
   
   to
   
   ```
   if [ "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" ] && [ "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]; then
     run_test "AWS Glue Schema Registry nightly end-to-end test" "$END_TO_END_DIR/test-scripts/test_glue_schema_registry.sh"
   fi
   ```
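   For what it's worth, both forms of the guard behave the same for unset or empty variables. A minimal sketch of the intended behavior (the `should_run` helper and sample values are illustrative, not from the PR):
   
   ```shell
   # Illustrative only: emulates the credential guard from the nightly
   # script with a helper function taking the two creds as arguments.
   should_run() {
     # Mirrors: if [ "$IT_CASE_GLUE_SCHEMA_ACCESS_KEY" ] && [ "$IT_CASE_GLUE_SCHEMA_SECRET_KEY" ]; then ...
     if [ "$1" ] && [ "$2" ]; then
       echo "run GSR e2e test"
     else
       echo "skip GSR e2e test"
     fi
   }
   
   should_run "" ""                     # empty/unset creds -> skip
   should_run "AKIA-example" "s3cr3t"   # both set -> run
   ```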





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * c4eb439a79d18f8296055e3582a0093146cbacc7 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14504) 
   * 288129b58edfbe6928104e517e3e7243e4b6e0d2 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14510) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 97864e82641279ac227eda19b938ebce62262867 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14316) 
   * 5779d8d7942ed9d747a4f07043c3fad3d1ff82f0 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-793518186


   Oh, I didn't notice that the main CI isn't working either. The condition looks fine.
   Maybe the problem is that we are not setting the environment variables here: https://github.com/apache/flink/blob/master/tools/azure-pipelines/build-apache-repo.yml#L51
   
   The build-apache-repo.yml definition is used for triggering the PR builds. Maybe Azure is substituting the literal variable name when the variable is not defined (hence the test is executed in the PR CI build, because the variable is replaced by its own name, and on my personal CI, because there the correct variable is set).
   
   > Why we need to pass the value of $IT_CASE_GLUE_SCHEMA_ACCESS_KEY to $SECRET_GLUE_SCHEMA_ACCESS_KEY?
   
   These are the secret variables I have defined in my personal CI:
   ![image](https://user-images.githubusercontent.com/89049/110437950-3403ce80-80b6-11eb-8472-fa3ede059afc.png)
   
   The definitions in `azure-pipelines.yml` get the defined variables (for example `variables.IT_CASE_GLUE_SCHEMA_ACCESS_KEY`) and store them in an internal, temporary variable (for example `SECRET_GLUE_SCHEMA_ACCESS_KEY`); then in `jobs-template.yml` we make the environment variable available again, under the original name: `IT_CASE_GLUE_SCHEMA_ACCESS_KEY: $(SECRET_GLUE_SCHEMA_ACCESS_KEY)`.
   Why this weird transformation from "defined variable" to "internal variable" back to "environment variable"?
   I'm not 100% sure anymore, but I remember that I've struggled a lot setting up the secrets. What I know is that you need to manually declare the secret variables on the respective Azure tasks like this: 
   ```yaml
         env:
           IT_CASE_S3_BUCKET: $(SECRET_S3_BUCKET)
           IT_CASE_S3_ACCESS_KEY: $(SECRET_S3_ACCESS_KEY)
           IT_CASE_S3_SECRET_KEY: $(SECRET_S3_SECRET_KEY)
           IT_CASE_GLUE_SCHEMA_ACCESS_KEY: $(SECRET_GLUE_SCHEMA_ACCESS_KEY)
           IT_CASE_GLUE_SCHEMA_SECRET_KEY: $(SECRET_GLUE_SCHEMA_SECRET_KEY)
   ```
   ... otherwise they won't work.
   Afaik, using the same name for the variable also didn't seem to work.
   
   What we know is that this approach works well for the S3 tests.
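   Pieced together from the description above, the two-step mapping might look roughly like this (variable names taken from the comment; the surrounding file structure is a sketch and may differ from the real pipeline definitions):
   
   ```yaml
   # azure-pipelines.yml (sketch): read the secret defined in the Azure UI
   # into an internal pipeline variable.
   variables:
     SECRET_GLUE_SCHEMA_ACCESS_KEY: $[variables.IT_CASE_GLUE_SCHEMA_ACCESS_KEY]
     SECRET_GLUE_SCHEMA_SECRET_KEY: $[variables.IT_CASE_GLUE_SCHEMA_SECRET_KEY]
   
   # jobs-template.yml (sketch): declare the secret on the task that needs it,
   # restoring the original environment variable name.
   env:
     IT_CASE_GLUE_SCHEMA_ACCESS_KEY: $(SECRET_GLUE_SCHEMA_ACCESS_KEY)
     IT_CASE_GLUE_SCHEMA_SECRET_KEY: $(SECRET_GLUE_SCHEMA_SECRET_KEY)
   ```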





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 81adc9cccd9fc0247e58ad7688252e1d98382cc5 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401) 
   * 56566c305fba167267cf427dddbf33c62b04f997 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944) 
   





[GitHub] [flink] rmetzger commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
rmetzger commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-767029789


   You don't need to open a PR for every fix; you can just keep (force-)pushing to the branch. Can you close the old PR?
   
   I'll review the PR tomorrow.





[GitHub] [flink] jiamo edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
jiamo edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-803882944


   A little question: we know that with the repo https://github.com/awslabs/aws-glue-data-catalog-client-for-apache-hive-metastore, AWS EMR Hive can talk seamlessly to the Glue metastore.
   But Flink's Hive integration uses the original metastore client.
   Would it be possible to add an option so that flink-connector-hive can talk to the Glue metastore?
   





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577390019



##########
File path: flink-end-to-end-tests/flink-glue-schema-registry-test/src/main/java/org.apache.flink.glue.schema.registry.test/GSRKinesisPubsubClient.java
##########
@@ -0,0 +1,185 @@
+/*

Review comment:
       Fixed







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 86353afe862e41b51dc48caf19f48fc03e6246b0 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397) 
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 81adc9cccd9fc0247e58ad7688252e1d98382cc5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bd617892bfec1db1654606355041a6e4b9050304 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419) 
   * d6f06f07e0895117b345b99533ba7eda672ba765 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489) 
   





[GitHub] [flink] jiamo commented on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
jiamo commented on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-804592812


   My use case: using Flink SQL to read data from an EMR-managed Hive table (where the table content is S3 files, partitioned by day).





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577477365



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/main/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryInputStreamDeserializer.java
##########
@@ -0,0 +1,85 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.utils.MutableByteArrayInputStream;
+
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
+import org.apache.avro.Schema;
+import org.apache.avro.SchemaParseException;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.Map;
+
+/**
+ * AWS Glue Schema Registry input stream deserializer that accepts an input stream, extracts the
+ * schema from it, and strips the Schema Registry information from the stream.
+ */
+public class GlueSchemaRegistryInputStreamDeserializer {
+    private final AWSDeserializer awsDeserializer;
+
+    /**
+     * Constructor accepts configuration map for AWS Deserializer.
+     *
+     * @param configs configuration map
+     */
+    public GlueSchemaRegistryInputStreamDeserializer(Map<String, Object> configs) {
+        awsDeserializer =
+                AWSDeserializer.builder()
+                        .credentialProvider(DefaultCredentialsProvider.builder().build())
+                        .configs(configs)
+                        .build();
+    }
+
+    public GlueSchemaRegistryInputStreamDeserializer(AWSDeserializer awsDeserializer) {
+        this.awsDeserializer = awsDeserializer;
+    }
+
+    /**
+     * Get schema and remove extra Schema Registry information within input stream.
+     *
+     * @param in input stream
+     * @return schema of object within input stream
+     * @throws IOException Exception during decompression
+     */
+    public Schema getSchemaAndDeserializedStream(InputStream in) throws IOException {
+        byte[] inputBytes = new byte[in.available()];
+        in.read(inputBytes);
+        in.reset();
+
+        MutableByteArrayInputStream mutableByteArrayInputStream = (MutableByteArrayInputStream) in;
+        String schemaDefinition = awsDeserializer.getSchema(inputBytes).getSchemaDefinition();
+        byte[] deserializedBytes = awsDeserializer.getActualData(inputBytes);
+        mutableByteArrayInputStream.setBuffer(deserializedBytes);
+
+        Schema schema;
+        try {
+            schema = (new Schema.Parser()).parse(schemaDefinition);

Review comment:
       Fixed
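
   For context on the stream handling discussed above: the quoted `getSchemaAndDeserializedStream` reads via `in.available()` and a single `in.read(inputBytes)` call, which can under-read because `InputStream.read` is not guaranteed to fill the buffer in one call. A minimal sketch of a loop-until-EOF read (illustrative only; the class and method names below are not from the PR):

   ```java
   import java.io.ByteArrayInputStream;
   import java.io.ByteArrayOutputStream;
   import java.io.IOException;
   import java.io.InputStream;
   import java.util.Arrays;

   public class ReadFully {
       // Reads the entire stream by looping until EOF, instead of trusting a
       // single read(buf) call, which may return fewer bytes than requested.
       static byte[] readAllBytes(InputStream in) throws IOException {
           ByteArrayOutputStream out = new ByteArrayOutputStream();
           byte[] buf = new byte[4096];
           int n;
           while ((n = in.read(buf)) != -1) {
               out.write(buf, 0, n);
           }
           return out.toByteArray();
       }

       public static void main(String[] args) throws IOException {
           byte[] data = "schema-registry-framed-payload".getBytes();
           byte[] copy = readAllBytes(new ByteArrayInputStream(data));
           assert Arrays.equals(data, copy) : "round-trip failed";
       }
   }
   ```

   The fully-read byte array can then be handed to `awsDeserializer.getSchema(...)` and `getActualData(...)` exactly as in the quoted code, without depending on `available()`.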







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14215",
       "triggerID" : "a87a26aa218658f7098367cae8c7a2ed18430296",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296",
       "triggerID" : "0114bac886e3c6f954632211f9a0e2f81998d1ac",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 0114bac886e3c6f954632211f9a0e2f81998d1ac Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14296) 
   





[GitHub] [flink] LinyuYao1021 commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
LinyuYao1021 commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r578008683



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/GlueSchemaRegistryInputStreamDeserializerTest.java
##########
@@ -0,0 +1,282 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.flink.formats.avro.utils.MutableByteArrayInputStream;
+
+import com.amazonaws.services.schemaregistry.common.AWSCompressionHandler;
+import com.amazonaws.services.schemaregistry.common.AWSSchemaRegistryDefaultCompression;
+import com.amazonaws.services.schemaregistry.common.configs.GlueSchemaRegistryConfiguration;
+import com.amazonaws.services.schemaregistry.deserializers.AWSDeserializer;
+import com.amazonaws.services.schemaregistry.exception.AWSSchemaRegistryException;
+import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
+import lombok.NonNull;
+import org.apache.avro.Schema;
+import org.apache.avro.io.BinaryEncoder;
+import org.apache.avro.io.DatumWriter;
+import org.apache.avro.io.EncoderFactory;
+import org.apache.avro.specific.SpecificDatumWriter;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.ExpectedException;
+import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
+import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
+
+import java.io.ByteArrayOutputStream;
+import java.io.File;
+import java.io.IOException;
+import java.nio.ByteBuffer;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.UUID;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.equalTo;
+import static org.hamcrest.Matchers.instanceOf;
+
+/** Tests for {@link GlueSchemaRegistryInputStreamDeserializer}. */
+public class GlueSchemaRegistryInputStreamDeserializerTest {
+    private static final String testTopic = "Test-Topic";
+    private static final UUID USER_SCHEMA_VERSION_ID = UUID.randomUUID();
+    private static final String AVRO_USER_SCHEMA_FILE = "src/test/java/resources/avro/user.avsc";
+    private static byte compressionByte;
+    private static Schema userSchema;
+    private static com.amazonaws.services.schemaregistry.common.Schema glueSchema;
+    private static User userDefinedPojo;
+    private static Map<String, Object> configs = new HashMap<>();
+    private static Map<String, String> metadata = new HashMap<>();
+    private static AWSCompressionHandler awsCompressionHandler;
+    private static AwsCredentialsProvider credentialsProvider =
+            DefaultCredentialsProvider.builder().build();
+    @Rule public ExpectedException thrown = ExpectedException.none();
+    private AWSDeserializer mockDeserializer;
+
+    @Before

Review comment:
       Updated some of the test classes to use `BeforeClass`







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d6f06f07e0895117b345b99533ba7eda672ba765 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489) 
   





[GitHub] [flink] zentol commented on a change in pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
zentol commented on a change in pull request #14737:
URL: https://github.com/apache/flink/pull/14737#discussion_r577886982



##########
File path: flink-formats/flink-avro-glue-schema-registry/src/test/java/org/apache/flink/formats/avro/glue/schema/registry/User.java
##########
@@ -0,0 +1,434 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.glue.schema.registry;
+
+import org.apache.avro.message.BinaryMessageDecoder;
+import org.apache.avro.message.BinaryMessageEncoder;
+import org.apache.avro.message.SchemaStore;
+import org.apache.avro.specific.SpecificData;
+
+@SuppressWarnings("all")
+@org.apache.avro.specific.AvroGenerated
+public class User extends org.apache.avro.specific.SpecificRecordBase

Review comment:
       Do we actually need this generated file to be in the source, or could we just generate it on the fly like we do for flink-avro?
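
       For reference, generating the class on the fly usually means hooking the `avro-maven-plugin` into the test build so the POJO is produced from the `.avsc` file at compile time rather than checked in. A hedged sketch of such a plugin configuration (the version property and directory paths are assumptions, not taken from this PR):

       ```xml
       <plugin>
         <groupId>org.apache.avro</groupId>
         <artifactId>avro-maven-plugin</artifactId>
         <version>${avro.version}</version>
         <executions>
           <execution>
             <phase>generate-test-sources</phase>
             <goals>
               <goal>schema</goal>
             </goals>
             <configuration>
               <!-- Compile every .avsc under the test resources into test sources. -->
               <testSourceDirectory>${project.basedir}/src/test/resources/avro</testSourceDirectory>
               <testOutputDirectory>${project.build.directory}/generated-test-sources/avro</testOutputDirectory>
             </configuration>
           </execution>
         </executions>
       </plugin>
       ```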







[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * 81adc9cccd9fc0247e58ad7688252e1d98382cc5 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #14737: [FLINK-19667] Add AWS Glue Schema Registry integration

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #14737:
URL: https://github.com/apache/flink/pull/14737#issuecomment-766281483


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12417",
       "triggerID" : "399f06e14079a512d35814508b6f7598d7d175ba",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bd617892bfec1db1654606355041a6e4b9050304",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12419",
       "triggerID" : "bd617892bfec1db1654606355041a6e4b9050304",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=12489",
       "triggerID" : "d6f06f07e0895117b345b99533ba7eda672ba765",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13395",
       "triggerID" : "0cf159b0cfbf5578482ea2bb5751ca0cdc72ea48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13397",
       "triggerID" : "86353afe862e41b51dc48caf19f48fc03e6246b0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e05b89cd87ce4ba54de32eff3c68985f766180f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13401",
       "triggerID" : "81adc9cccd9fc0247e58ad7688252e1d98382cc5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "56566c305fba167267cf427dddbf33c62b04f997",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13426",
       "triggerID" : "56566c305fba167267cf427dddbf33c62b04f997",
       "triggerType" : "PUSH"
     }, {
       "hash" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13794",
       "triggerID" : "061978aeb474636d954f85b0408b00a21f2571e5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13944",
       "triggerID" : "46aa624c7b3fc5f65dcc85f9a526f2eb11127e3a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f793772656ad942f463a23b0dd43f3522f147493",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=13953",
       "triggerID" : "f793772656ad942f463a23b0dd43f3522f147493",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14148",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "fe824ddfb68b9efcf9f25c2555f1d08a547c2c5f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14112",
       "triggerID" : "790802979",
       "triggerType" : "MANUAL"
     }, {
       "hash" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153",
       "triggerID" : "c77e3d4a1a7484c4f1e24b23a1099364b834cf75",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e05b89cd87ce4ba54de32eff3c68985f766180f UNKNOWN
   * c77e3d4a1a7484c4f1e24b23a1099364b834cf75 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=14153) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

