Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2020/07/17 07:18:55 UTC

[GitHub] [flink] danny0405 opened a new pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

danny0405 opened a new pull request #12919:
URL: https://github.com/apache/flink/pull/12919


   … data from Kafka
   
   ## What is the purpose of the change
   
   Supports reading and writing data with SQL using the Confluent Schema Registry Avro format.
   
   **The format details**
   
   _The factory identifier (or format id)_
   
   There are two candidates:
   
   `avro-sr`: the pattern borrowed from KSQL JSON_SR format [1]
   `avro-confluent`: the pattern borrowed from Clickhouse AvroConfluent [2]
   
   Personally, I would prefer `avro-sr` because it is more concise, and Confluent is a company name, which I think is not well suited for a format name.
   
   **The format attributes**
   
   Option | Required | Remark
   -- | -- | --
   schema-string | true | Avro schema string used for (de)serialization
   schema-registry.url | true | URL of the Schema Registry service to connect to
   schema-registry.subject | false | Subject name to write to in the Schema Registry service; required for sinks
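   
   The following is a minimal DDL sketch of how these options could be wired up with the Kafka connector. It assumes the `avro-confluent` identifier is chosen and that the format options are prefixed with the format name; the topic and column names are made up, and the `schema-string` option from the table above is omitted for brevity:
   
   ```
   CREATE TABLE user_behavior (
     user_id BIGINT,
     item_id BIGINT
   ) WITH (
     'connector' = 'kafka',
     'topic' = 'user_behavior',
     'properties.bootstrap.servers' = 'localhost:9092',
     'format' = 'avro-confluent',
     'avro-confluent.schema-registry.url' = 'http://localhost:8081',
     -- only needed when the table is used as a sink
     'avro-confluent.schema-registry.subject' = 'user_behavior-value'
   );
   ```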
   
   
   ## Brief change log
   
     - Add `avro-sr` read/write row data format
     - Add tests
   
   
   ## Verifying this change
   
   Added tests.
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): no
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: no
     - The serializers: no
     - The runtime per-record code paths (performance sensitive): no
     - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn/Mesos, ZooKeeper: no
     - The S3 file system connector: no
   
   ## Documentation
   
     - Does this pull request introduce a new feature? yes
     - If yes, how is the feature documented? not documented
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   ## CI report:
   
   * 87d74c99c293fba3090208d89f687be7dbe3f3ab Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622) 
   * 48d837f15c74d134f2ba8b8e8f7ea27e6b62299f UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dawidwys commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r458684350



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/AvroRowDataSerializationSchema.java
##########
@@ -64,61 +41,59 @@
 
 	private static final long serialVersionUID = 1L;
 
+	/** Nested schema to serialize the {@link GenericRecord} into bytes. **/
+	private final SerializationSchema<GenericRecord> nestedSchema;
+
 	/**
 	 * Logical type describing the input type.
 	 */
 	private final RowType rowType;
 
-	/**
-	 * Runtime instance that performs the actual work.
-	 */
-	private final SerializationRuntimeConverter runtimeConverter;
-
 	/**
 	 * Avro serialization schema.
 	 */
 	private transient Schema schema;
 
 	/**
-	 * Writer to serialize Avro record into a Avro bytes.
-	 */
-	private transient DatumWriter<IndexedRecord> datumWriter;
-
-	/**
-	 * Output stream to serialize records into byte array.
+	 * Runtime instance that performs the actual work.
 	 */
-	private transient ByteArrayOutputStream arrayOutputStream;
+	private final RowDataToAvroConverters.RowDataToAvroConverter runtimeConverter;
 
 	/**
-	 * Low-level class for serialization of Avro values.
+	 * Creates an Avro serialization schema with the given record row type.
 	 */
-	private transient Encoder encoder;
+	public AvroRowDataSerializationSchema(RowType rowType) {

Review comment:
       Why?







[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 9d8870894b4d9d434c45b58339985aed3b76a8be Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851) 
   



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   ## CI report:
   
   * 8984f0fbb7914ce69763e0a2b7afb869621bdd0d Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687) 
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   ## CI report:
   
   * 48d837f15c74d134f2ba8b8e8f7ea27e6b62299f Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627) 
   * d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0 UNKNOWN
   



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 810321d988a8284eb54c2963f22a049dc06ac8aa Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4863) 
   



[GitHub] [flink] sjwiesman commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
sjwiesman commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-660174388


   +1 for SR and please tag me when you open a documentation PR for this feature





[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 6839c54eedcdca926b8304782fabcb0dc529c5a6 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4847) 
   



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 22ed53e6e047b379e0ee568298600afd9283b2b8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 6839c54eedcdca926b8304782fabcb0dc529c5a6 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4847) 
   



[GitHub] [flink] maver1ck commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-716368991


   Thanks @danny0405 





[GitHub] [flink] dawidwys commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r503798223



##########
File path: flink-formats/flink-avro-confluent-registry/src/main/resources/META-INF/services/org.apache.flink.table.factories.Factory
##########
@@ -0,0 +1,16 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+org.apache.flink.formats.avro.registry.confluent.RegistryAvroFormatFactory

Review comment:
       It is not available yet, because it will only be included in the 1.12 version.







[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 810321d988a8284eb54c2963f22a049dc06ac8aa UNKNOWN
   



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 22ed53e6e047b379e0ee568298600afd9283b2b8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 6839c54eedcdca926b8304782fabcb0dc529c5a6 UNKNOWN
   



[GitHub] [flink] maver1ck commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-715009216


   @danny0405 
   The problem is that with this strategy I'm unable to read anything from Kafka using the Confluent Registry. For example,
   I have data in Kafka with the following value schema:
   ```
   {
     "type": "record",
     "name": "myrecord",
     "fields": [
       {
         "name": "f1",
         "type": "string"
       }
     ]
   }
   ```
   I'm creating a table using the `avro-confluent` format:
   ```
   create table `test` (
   	`f1` STRING
   ) WITH (
     'connector' = 'kafka', 
     'topic' = 'test', 
     'properties.bootstrap.servers' = 'localhost:9092', 
     'properties.group.id' = 'test1234', 
      'scan.startup.mode' = 'earliest-offset', 
     'format' = 'avro-confluent',
     'avro-confluent.schema-registry.url' = 'http://localhost:8081'
   );
   ```
   When trying to select data, I'm getting the following error:
   ```
   SELECT * FROM test;
   [ERROR] Could not execute SQL statement. Reason:
   org.apache.avro.AvroTypeException: Found myrecord, expecting record, missing required field record_f1
   ```
   





[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 87d74c99c293fba3090208d89f687be7dbe3f3ab Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622) 
   * 48d837f15c74d134f2ba8b8e8f7ea27e6b62299f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 22ed53e6e047b379e0ee568298600afd9283b2b8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] maver1ck edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-715009216


   @danny0405 
   The problem is that this is not compatible with existing data. I'm unable to read anything from Kafka using the Confluent Schema Registry. Example:
   I have data in Kafka with the following value schema:
   ```
   {
     "type": "record",
     "name": "myrecord",
     "fields": [
       {
         "name": "f1",
         "type": "string"
       }
     ]
   }
   ```
   I'm creating a table using the avro-confluent format:
   ```
   create table `test` (
   	`f1` STRING
   ) WITH (
     'connector' = 'kafka', 
     'topic' = 'test', 
     'properties.bootstrap.servers' = 'localhost:9092', 
     'properties.group.id' = 'test1234', 
      'scan.startup.mode' = 'earliest-offset', 
     'format' = 'avro-confluent',
     'avro-confluent.schema-registry.url' = 'http://localhost:8081'
   );
   ```
   When trying to select data I'm getting an error:
   ```
   SELECT * FROM test;
   [ERROR] Could not execute SQL statement. Reason:
   org.apache.avro.AvroTypeException: Found myrecord, expecting record, missing required field record_f1
   ```
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] maver1ck commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-714693831


   @danny0405 
   Is there any reason why all the fields read and written by this format have the prefix 'record_'?


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8984f0fbb7914ce69763e0a2b7afb869621bdd0d Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687) 
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r459273734



##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/RegistryAvroRowDataSeDeSchemaTest.java
##########
@@ -0,0 +1,129 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro;
+
+import org.apache.flink.formats.avro.generated.Address;
+import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
+import org.apache.flink.formats.avro.utils.TestDataGenerator;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.RowType;
+
+import org.junit.Before;
+import org.junit.Test;
+
+import java.util.Random;
+
+import static org.apache.flink.formats.avro.utils.AvroTestUtils.writeRecord;
+import static org.hamcrest.core.Is.is;
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertThat;
+
+/**
+ * Tests for {@link AvroRowDataDeserializationSchema} and
+ * {@link AvroRowDataSerializationSchema} for schema registry avro.
+ */
+public class RegistryAvroRowDataSeDeSchemaTest {
+	private static final String ADDRESS_SCHEMA = "" +

Review comment:
       I have added a schema registry server there and tested the new format SerDe against the schema registry service. See the new `RegistryAvroRowDataSeDeSchemaTest`.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 22ed53e6e047b379e0ee568298600afd9283b2b8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] maver1ck edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-715222930


   @danny0405 
   I think we have one more problem.
   When Flink creates the schema in the registry, nullability is not properly set for logical types.
   Example table:
   ```
   create table `test_logical_null` (
   	`string_field` STRING,
   	`timestamp_field` TIMESTAMP(3)
   ) WITH (
     'connector' = 'kafka', 
     'topic' = 'test-logical-null', 
     'properties.bootstrap.servers' = 'localhost:9092', 
     'properties.group.id' = 'test12345', 
      'scan.startup.mode' = 'earliest-offset', 
     'format' = 'avro-confluent', -- Must be set to 'avro-confluent' to configure this format.
     'avro-confluent.schema-registry.url' = 'http://localhost:8081', -- URL to connect to Confluent Schema Registry
     'avro-confluent.schema-registry.subject' = 'test-logical-null' -- Subject name to write to the Schema Registry service; required for sinks
   )
   ```
   Schema:
   ```
   {
     "type": "record",
     "name": "record",
     "fields": [
       {
         "name": "string_field",
         "type": [
           "string",
           "null"
         ]
       },
       {
         "name": "timestamp_field",
         "type": {
           "type": "long",
           "logicalType": "timestamp-millis"
         }
       }
     ]
   }
   ```
   For NOT NULL fields:
   ```
   create table `test_logical_notnull` (
   	`string_field` STRING NOT NULL,
   	`timestamp_field` TIMESTAMP(3) NOT NULL
   ) WITH (
     'connector' = 'kafka', 
     'topic' = 'test-logical-notnull', 
     'properties.bootstrap.servers' = 'localhost:9092', 
     'properties.group.id' = 'test12345', 
      'scan.startup.mode' = 'earliest-offset', 
     'format' = 'avro-confluent', -- Must be set to 'avro-confluent' to configure this format.
     'avro-confluent.schema-registry.url' = 'http://localhost:8081', -- URL to connect to Confluent Schema Registry
     'avro-confluent.schema-registry.subject' = 'test-logical-notnull-value' -- Subject name to write to the Schema Registry service; required for sinks
   );
   ```
   Schema:
   ```
   {
     "type": "record",
     "name": "record",
     "fields": [
       {
         "name": "string_field",
         "type": "string"
       },
       {
         "name": "timestamp_field",
         "type": {
           "type": "long",
           "logicalType": "timestamp-millis"
         }
       }
     ]
   }
   ```
   As you can see, string_field gets a proper union with null (it is a nullable field), but the union is missing for timestamp_field in both examples.
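   For comparison, this is a minimal sketch of how I would expect the nullable timestamp field to be written if nullability were propagated, simply mirroring the union already generated for string_field (an assumption about the intended output, not what Flink currently produces):
   ```
   {
     "name": "timestamp_field",
     "type": [
       {
         "type": "long",
         "logicalType": "timestamp-millis"
       },
       "null"
     ]
   }
   ```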
   
   EDIT: I added bug report for this
   https://issues.apache.org/jira/browse/FLINK-19786


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] sjwiesman commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
sjwiesman commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r456517725



##########
File path: flink-formats/flink-avro-confluent-registry/pom.xml
##########
@@ -30,7 +30,7 @@ under the License.
 	<artifactId>flink-avro-confluent-registry</artifactId>
 
 	<properties>
-		<confluent.schema.registry.version>4.1.0</confluent.schema.registry.version>
+		<confluent.schema.registry.version>5.5.1</confluent.schema.registry.version>

Review comment:
       Please also update the version in the NOTICE file.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] maver1ck edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-714693831


   @danny0405 @dawidwys 
   Is there any reason why all the fields read and written by this format have the prefix 'record_'?


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6839c54eedcdca926b8304782fabcb0dc529c5a6",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4847",
       "triggerID" : "6839c54eedcdca926b8304782fabcb0dc529c5a6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 6839c54eedcdca926b8304782fabcb0dc529c5a6 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4847) 
   * 9d8870894b4d9d434c45b58339985aed3b76a8be UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     }, {
       "hash" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4863",
       "triggerID" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4875",
       "triggerID" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 6bd8c02de778a8ff2f34a19d5beee414beac3f69 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4875) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     }, {
       "hash" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4863",
       "triggerID" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4875",
       "triggerID" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 810321d988a8284eb54c2963f22a049dc06ac8aa Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4863) 
   * 6bd8c02de778a8ff2f34a19d5beee414beac3f69 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4875) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6839c54eedcdca926b8304782fabcb0dc529c5a6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4847",
       "triggerID" : "6839c54eedcdca926b8304782fabcb0dc529c5a6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc53f692ab7edf110e6c7c39202e50ed1ec0c05d",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4856",
       "triggerID" : "bc53f692ab7edf110e6c7c39202e50ed1ec0c05d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0d49193a68ad28f6d2965bc8bfd57c59b2b21935",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "0d49193a68ad28f6d2965bc8bfd57c59b2b21935",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * bc53f692ab7edf110e6c7c39202e50ed1ec0c05d Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4856) 
   * 0d49193a68ad28f6d2965bc8bfd57c59b2b21935 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] homepy commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
homepy commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r503817414



##########
File path: flink-formats/flink-avro-confluent-registry/src/main/resources/META-INF/services/org.apache.flink.table.factories.Factory
##########
@@ -0,0 +1,16 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+org.apache.flink.formats.avro.registry.confluent.RegistryAvroFormatFactory

Review comment:
       Thanks. 
   I found it...
   I need to use the master branch now...
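
   For context on how that registration file takes effect: it is a standard Java SPI descriptor, so the factory class it lists is picked up by `ServiceLoader` at runtime (Flink wraps the same lookup in its factory utilities). A minimal sketch of the lookup, for illustration only:

   ```java
   import java.util.ServiceLoader;

   import org.apache.flink.table.factories.Factory;

   // Illustration of the Java SPI lookup behind
   // META-INF/services/org.apache.flink.table.factories.Factory: every class name
   // listed in that file is instantiated by ServiceLoader and can then be matched
   // against the 'format' option by its identifier.
   public class FactoryDiscoverySketch {
       public static void main(String[] args) {
           for (Factory factory : ServiceLoader.load(Factory.class)) {
               // With the registration file on the classpath, the new format factory's
               // identifier shows up among the discovered factories.
               System.out.println(factory.factoryIdentifier());
           }
       }
   }
   ```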




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 22ed53e6e047b379e0ee568298600afd9283b2b8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dawidwys merged pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
dawidwys merged pull request #12919:
URL: https://github.com/apache/flink/pull/12919


   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-716375599


   I have updated the fix, @maver1ck. Please take a look when you have time.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8984f0fbb7914ce69763e0a2b7afb869621bdd0d Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687) 
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] maver1ck edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-714693831


   @danny0405 @dawidwys 
   Is there any reason all the fields read and written by this format have the prefix 'record_'? (I'm using Flink SQL for this client.)
   I found what is probably the responsible code here, but I still have a problem with this solution:
   https://github.com/apache/flink/blob/de87a2debde8546e6741390a81f43c032521c3c0/flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java#L365
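
   For illustration (this is not the Flink code itself, just a sketch of the naming pattern the link above points to): converting a row type to an Avro record schema requires generated record names. Assuming a default root name of "record" and nested records named "<parent>_<field>" — both assumptions here — names like `record_address` fall out as follows:

   ```java
   import org.apache.avro.Schema;
   import org.apache.avro.SchemaBuilder;

   // Sketch only: builds an Avro schema the way a row-to-Avro converter might,
   // with a default root record name and nested records named "<parent>_<field>".
   // The concrete names used by AvroSchemaConverter are an assumption here.
   public class RecordNamePrefixSketch {

       static Schema convertRow(String rowName) {
           return SchemaBuilder.record(rowName)
                   .fields()
                   .requiredString("id")
                   .name("address")
                   .type(SchemaBuilder.record(rowName + "_address")
                           .fields()
                           .requiredString("city")
                           .endRecord())
                   .noDefault()
                   .endRecord();
       }

       public static void main(String[] args) {
           // Prints a schema whose nested record is named "record_address".
           System.out.println(convertRow("record").toString(true));
       }
   }
   ```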


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     }, {
       "hash" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4863",
       "triggerID" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 810321d988a8284eb54c2963f22a049dc06ac8aa Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4863) 
   * 6bd8c02de778a8ff2f34a19d5beee414beac3f69 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] maver1ck commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-716764210


   @danny0405 I see the code review is still in progress.
   Could you please let me know when this PR will be polished up a little?


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * edb952c0f8ae4394b7f5238f4fea39878106a775 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667) 
   * 8984f0fbb7914ce69763e0a2b7afb869621bdd0d Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 48d837f15c74d134f2ba8b8e8f7ea27e6b62299f Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627) 
   * d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-662792122


   > Would `flink-avro-confluent` be a better module name than `flink-avro-confluent-registry`? IMO `registry` has nothing to do with the format itself.
   
   It depends on how we look at it; the Confluent Avro format is mainly designed for use with the Schema Registry.
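
   To make that concrete: what makes it a "registry" format is the wire framing — a magic byte, a 4-byte schema id assigned by the Schema Registry, and then the Avro-encoded payload — so a deserializer cannot decode a record without asking the registry for the writer schema. A rough sketch of reading that framing (illustration only, not this PR's code):

   ```java
   import java.io.ByteArrayInputStream;
   import java.io.DataInputStream;
   import java.io.IOException;

   // Sketch of the Confluent wire framing: [magic byte 0x0][4-byte schema id][Avro payload].
   // The schema id is what the (de)serializer resolves through the Schema Registry.
   public class ConfluentFramingSketch {

       private static final byte MAGIC_BYTE = 0x0;

       static int readSchemaId(byte[] message) throws IOException {
           try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(message))) {
               byte magic = in.readByte();
               if (magic != MAGIC_BYTE) {
                   throw new IOException("Unknown magic byte: " + magic);
               }
               // Big-endian int, as written by Confluent's serializers.
               return in.readInt();
           }
       }
   }
   ```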


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6839c54eedcdca926b8304782fabcb0dc529c5a6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4847",
       "triggerID" : "6839c54eedcdca926b8304782fabcb0dc529c5a6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc53f692ab7edf110e6c7c39202e50ed1ec0c05d",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4856",
       "triggerID" : "bc53f692ab7edf110e6c7c39202e50ed1ec0c05d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0d49193a68ad28f6d2965bc8bfd57c59b2b21935",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "0d49193a68ad28f6d2965bc8bfd57c59b2b21935",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 9d8870894b4d9d434c45b58339985aed3b76a8be Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851) 
   * bc53f692ab7edf110e6c7c39202e50ed1ec0c05d Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4856) 
   * 0d49193a68ad28f6d2965bc8bfd57c59b2b21935 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-660392881


   > +1 for SR and please tag me when you open a documentation PR for this feature
   
   Sure, thanks for taking care of the documentation.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r458664624



##########
File path: flink-formats/flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/CachedSchemaCoderProvider.java
##########
@@ -0,0 +1,76 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.registry.confluent;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
+
+import javax.annotation.Nullable;
+
+import java.util.Objects;
+
+/** A {@link SchemaCoder.SchemaCoderProvider} that uses a cached schema registry
+ * client underlying. **/
+@Internal
+class CachedSchemaCoderProvider implements SchemaCoder.SchemaCoderProvider {

Review comment:
       Personally I prefer a normal interface to a static inner one, and we can also reuse this data structure.
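
   To make the trade-off concrete, a rough sketch of what the top-level class could look like (the `get()` signature of `SchemaCoder.SchemaCoderProvider` and the `ConfluentSchemaRegistryCoder` coder are assumptions here; imports as in the diff above):

   ```java
   @Internal
   class CachedSchemaCoderProvider implements SchemaCoder.SchemaCoderProvider {

       private static final long serialVersionUID = 1L;

       @Nullable
       private final String subject;
       private final String url;
       private final int identityMapCapacity;

       CachedSchemaCoderProvider(@Nullable String subject, String url, int identityMapCapacity) {
           this.subject = subject;
           this.url = Objects.requireNonNull(url);
           this.identityMapCapacity = identityMapCapacity;
       }

       @Override
       public SchemaCoder get() {
           // One cached registry client per coder; the cache avoids re-fetching the
           // same schema id from the registry for every record.
           return new ConfluentSchemaRegistryCoder(
                   subject, new CachedSchemaRegistryClient(url, identityMapCapacity));
       }
   }
   ```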




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659917403


   Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress of the review.
   
   
   ## Automated Checks
   Last check on commit bac18583fd0ba4855eebd76409198e1fb3fc3314 (Fri Jul 17 07:20:46 UTC 2020)
   
   **Warnings:**
    * **1 pom.xml files were touched**: Check for build and licensing issues.
    * No documentation files were touched! Remember to keep the Flink docs up to date!
   
   
   <sub>Mention the bot in a comment to re-run the automated checks.</sub>
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
   * ❓ 2. There is [consensus] that the contribution should go into to Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process.<details>
    The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`)
    - `@flinkbot approve all` to approve all aspects
    - `@flinkbot approve-until architecture` to approve everything until `architecture`
    - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention
    - `@flinkbot disapprove architecture` to remove an approval you gave earlier
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8984f0fbb7914ce69763e0a2b7afb869621bdd0d Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687) 
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] maver1ck edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-714693831


   @danny0405 @dawidwys 
   Is there any reason all the fields read and written by this format have the prefix 'record_'? (I'm using Flink SQL for this client.)


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bac18583fd0ba4855eebd76409198e1fb3fc3314 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dawidwys commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r458668739



##########
File path: flink-formats/flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/CachedSchemaCoderProvider.java
##########
@@ -0,0 +1,76 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.registry.confluent;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
+
+import javax.annotation.Nullable;
+
+import java.util.Objects;
+
+/** A {@link SchemaCoder.SchemaCoderProvider} that uses a cached schema registry
+ * client underlying. **/
+@Internal
+class CachedSchemaCoderProvider implements SchemaCoder.SchemaCoderProvider {

Review comment:
       Where can we reuse it?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bac18583fd0ba4855eebd76409198e1fb3fc3314 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 87d74c99c293fba3090208d89f687be7dbe3f3ab Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     }, {
       "hash" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4863",
       "triggerID" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4875",
       "triggerID" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "triggerType" : "PUSH"
     }, {
       "hash" : "52518eecfce65f5adceda689fa720f15c85413b6",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4902",
       "triggerID" : "52518eecfce65f5adceda689fa720f15c85413b6",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 6bd8c02de778a8ff2f34a19d5beee414beac3f69 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4875) 
   * 52518eecfce65f5adceda689fa720f15c85413b6 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4902) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] maver1ck commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-715222930


   @danny0405 
   I think we have one more problem.
   When Flink creates the schema in the registry, nullability is not properly set for logical types.
   Examples below. Table with nullable fields:
   ```
   create table `test_logical_null` (
   	`string_field` STRING,
   	`timestamp_field` TIMESTAMP(3)
   ) WITH (
     'connector' = 'kafka', 
     'topic' = 'test-logical-null', 
     'properties.bootstrap.servers' = 'localhost:9092', 
     'properties.group.id' = 'test12345', 
      'scan.startup.mode' = 'earliest-offset', 
     'format' = 'avro-confluent', -- Must be set to 'avro-confluent' to configure this format.
     'avro-confluent.schema-registry.url' = 'http://localhost:8081', -- URL to connect to Confluent Schema Registry
     'avro-confluent.schema-registry.subject' = 'test-logical-null' -- Subject name to write to the Schema Registry service; required for sinks
   )
   ```
   Schema:
   ```
   {
     "type": "record",
     "name": "record",
     "fields": [
       {
         "name": "string_field",
         "type": [
           "string",
           "null"
         ]
       },
       {
         "name": "timestamp_field",
         "type": {
           "type": "long",
           "logicalType": "timestamp-millis"
         }
       }
     ]
   }
   ```
   For NOT NULL fields:
   ```
   create table `test_logical_notnull` (
   	`string_field` STRING NOT NULL,
   	`timestamp_field` TIMESTAMP(3) NOT NULL
   ) WITH (
     'connector' = 'kafka', 
     'topic' = 'test-logical-notnull', 
     'properties.bootstrap.servers' = 'localhost:9092', 
     'properties.group.id' = 'test12345', 
      'scan.startup.mode' = 'earliest-offset', 
     'format' = 'avro-confluent', -- Must be set to 'avro-confluent' to configure this format.
     'avro-confluent.schema-registry.url' = 'http://localhost:8081', -- URL to connect to Confluent Schema Registry
     'avro-confluent.schema-registry.subject' = 'test-logical-notnull-value' -- Subject name to write to the Schema Registry service; required for sinks
   );
   ```
   Schema
   ```
   {
     "type": "record",
     "name": "record",
     "fields": [
       {
         "name": "string_field",
         "type": "string"
       },
       {
         "name": "timestamp_field",
         "type": {
           "type": "long",
           "logicalType": "timestamp-millis"
         }
       }
     ]
   }
   ```
   As you can see, string_field gets a proper union with null when the field is nullable. For timestamp_field the union is missing in both examples.
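   
   For reference, I would expect the nullable timestamp_field to also be written as a union with null (that is how Avro expresses nullable logical types). Roughly like this, with the member order just mirroring the string_field example above:
   ```
   {
     "name": "timestamp_field",
     "type": [
       {
         "type": "long",
         "logicalType": "timestamp-millis"
       },
       "null"
     ]
   }
   ```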


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8984f0fbb7914ce69763e0a2b7afb869621bdd0d Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687) 
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] maver1ck commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-715206532


   OK. It's working. I'm able to read data.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8984f0fbb7914ce69763e0a2b7afb869621bdd0d Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687) 
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r458670981



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/AvroRowDataSerializationSchema.java
##########
@@ -64,61 +41,59 @@
 
 	private static final long serialVersionUID = 1L;
 
+	/** Nested schema to serialize the {@link GenericRecord} into bytes. **/
+	private final SerializationSchema<GenericRecord> nestedSchema;
+
 	/**
 	 * Logical type describing the input type.
 	 */
 	private final RowType rowType;
 
-	/**
-	 * Runtime instance that performs the actual work.
-	 */
-	private final SerializationRuntimeConverter runtimeConverter;
-
 	/**
 	 * Avro serialization schema.
 	 */
 	private transient Schema schema;
 
 	/**
-	 * Writer to serialize Avro record into a Avro bytes.
-	 */
-	private transient DatumWriter<IndexedRecord> datumWriter;
-
-	/**
-	 * Output stream to serialize records into byte array.
+	 * Runtime instance that performs the actual work.
 	 */
-	private transient ByteArrayOutputStream arrayOutputStream;
+	private final RowDataToAvroConverters.RowDataToAvroConverter runtimeConverter;
 
 	/**
-	 * Low-level class for serialization of Avro values.
+	 * Creates an Avro serialization schema with the given record row type.
 	 */
-	private transient Encoder encoder;
+	public AvroRowDataSerializationSchema(RowType rowType) {

Review comment:
       I would prefer to keep a constructor with a default implementation.
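   
   As a rough sketch (assuming the default simply wires up the generic Avro serialization schema together with the row converter from this PR; not a final API proposal), such a constructor could look like:
   ```
   public AvroRowDataSerializationSchema(RowType rowType) {
   	// Default: derive the Avro schema and the converter from the row type.
   	this(rowType,
   		AvroSerializationSchema.forGeneric(AvroSchemaConverter.convertToSchema(rowType)),
   		RowDataToAvroConverters.createRowConverter(rowType));
   }
   ```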




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * edb952c0f8ae4394b7f5238f4fea39878106a775 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667) 
   * 8984f0fbb7914ce69763e0a2b7afb869621bdd0d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] maver1ck commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-715143420


   I will check... mvn is running


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dawidwys commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r458585321



##########
File path: flink-formats/flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/CachedSchemaCoderProvider.java
##########
@@ -0,0 +1,76 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.registry.confluent;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
+
+import javax.annotation.Nullable;
+
+import java.util.Objects;
+
+/** A {@link SchemaCoder.SchemaCoderProvider} that uses a cached schema registry
+ * client underlying. **/
+@Internal
+class CachedSchemaCoderProvider implements SchemaCoder.SchemaCoderProvider {

Review comment:
       Do we still need to extract this class after the latest changes?

##########
File path: flink-formats/flink-avro/pom.xml
##########
@@ -140,6 +132,14 @@ under the License.
 			<scope>test</scope>
 		</dependency>
 
+		<!-- Avro RowData schema test dependency -->

Review comment:
       Unnecessary change.

##########
File path: flink-formats/flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/ConfluentRegistryAvroSerializationSchema.java
##########
@@ -92,25 +91,4 @@ private ConfluentRegistryAvroSerializationSchema(Class<T> recordClazz, Schema sc
 			new CachedSchemaCoderProvider(subject, schemaRegistryUrl, DEFAULT_IDENTITY_MAP_CAPACITY)
 		);
 	}
-
-	private static class CachedSchemaCoderProvider implements SchemaCoder.SchemaCoderProvider {

Review comment:
       ditto

##########
File path: flink-formats/flink-avro-confluent-registry/src/test/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroFormatFactoryTest.java
##########
@@ -0,0 +1,183 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.registry.confluent;
+
+import org.apache.flink.api.common.serialization.DeserializationSchema;
+import org.apache.flink.api.common.serialization.SerializationSchema;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.formats.avro.AvroRowDataDeserializationSchema;
+import org.apache.flink.formats.avro.AvroRowDataSerializationSchema;
+import org.apache.flink.formats.avro.AvroToRowDataConverters;
+import org.apache.flink.formats.avro.RowDataToAvroConverters;
+import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableSchema;
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.CatalogTableImpl;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+import org.apache.flink.table.connector.sink.DynamicTableSink;
+import org.apache.flink.table.connector.source.DynamicTableSource;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.factories.FactoryUtil;
+import org.apache.flink.table.factories.TestDynamicTableFactory;
+import org.apache.flink.table.runtime.connector.source.ScanRuntimeProviderContext;
+import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
+import org.apache.flink.table.types.logical.RowType;
+
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.ExpectedException;
+
+import java.util.HashMap;
+import java.util.Map;
+import java.util.function.Consumer;
+
+import static org.apache.flink.core.testutils.FlinkMatchers.containsCause;
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Tests for the {@link RegistryAvroFormatFactory}.
+ */
+public class RegistryAvroFormatFactoryTest {
+	private TableSchema schema;
+	private RowType rowType;
+	private String subject;
+	private String registryURL;
+
+	@Rule
+	public ExpectedException thrown = ExpectedException.none();
+
+	@Before
+	public void before() {
+		this.schema = TableSchema.builder()
+				.field("a", DataTypes.STRING())
+				.field("b", DataTypes.INT())
+				.field("c", DataTypes.BOOLEAN())
+				.build();
+		this.rowType = (RowType) schema.toRowDataType().getLogicalType();
+		this.subject = "test-subject";
+		this.registryURL = "http://localhost:8081";
+	}
+
+	@Test
+	public void testSeDeSchema() {

Review comment:
       You have two completely independent tests in a single method. Please split it into two separate tests. We should always aim to test a single thing at a time. The benefits are:
   1. Both tests are always executed, independent of the result of the other.
   2. It's easier to debug: you don't need to run the first case if the second fails.

##########
File path: flink-formats/flink-avro-confluent-registry/src/test/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroFormatFactoryTest.java
##########
@@ -0,0 +1,183 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.registry.confluent;
+
+import org.apache.flink.api.common.serialization.DeserializationSchema;
+import org.apache.flink.api.common.serialization.SerializationSchema;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.formats.avro.AvroRowDataDeserializationSchema;
+import org.apache.flink.formats.avro.AvroRowDataSerializationSchema;
+import org.apache.flink.formats.avro.AvroToRowDataConverters;
+import org.apache.flink.formats.avro.RowDataToAvroConverters;
+import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableSchema;
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.CatalogTableImpl;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+import org.apache.flink.table.connector.sink.DynamicTableSink;
+import org.apache.flink.table.connector.source.DynamicTableSource;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.factories.FactoryUtil;
+import org.apache.flink.table.factories.TestDynamicTableFactory;
+import org.apache.flink.table.runtime.connector.source.ScanRuntimeProviderContext;
+import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
+import org.apache.flink.table.types.logical.RowType;
+
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.ExpectedException;
+
+import java.util.HashMap;
+import java.util.Map;
+import java.util.function.Consumer;
+
+import static org.apache.flink.core.testutils.FlinkMatchers.containsCause;
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Tests for the {@link RegistryAvroFormatFactory}.
+ */
+public class RegistryAvroFormatFactoryTest {
+	private TableSchema schema;
+	private RowType rowType;
+	private String subject;
+	private String registryURL;
+
+	@Rule
+	public ExpectedException thrown = ExpectedException.none();
+
+	@Before
+	public void before() {
+		this.schema = TableSchema.builder()
+				.field("a", DataTypes.STRING())
+				.field("b", DataTypes.INT())
+				.field("c", DataTypes.BOOLEAN())
+				.build();
+		this.rowType = (RowType) schema.toRowDataType().getLogicalType();
+		this.subject = "test-subject";
+		this.registryURL = "http://localhost:8081";
+	}
+
+	@Test
+	public void testSeDeSchema() {
+		final AvroRowDataDeserializationSchema expectedDeser =
+				new AvroRowDataDeserializationSchema(
+						ConfluentRegistryAvroDeserializationSchema.forGeneric(
+								AvroSchemaConverter.convertToSchema(rowType),
+								registryURL),
+						AvroToRowDataConverters.createRowConverter(rowType),
+						InternalTypeInfo.of(rowType));
+
+		final Map<String, String> options = getAllOptions();
+
+		final DynamicTableSource actualSource = createTableSource(options);
+		assert actualSource instanceof TestDynamicTableFactory.DynamicTableSourceMock;
+		TestDynamicTableFactory.DynamicTableSourceMock scanSourceMock =
+				(TestDynamicTableFactory.DynamicTableSourceMock) actualSource;
+
+		DeserializationSchema<RowData> actualDeser = scanSourceMock.valueFormat
+				.createRuntimeDecoder(
+						ScanRuntimeProviderContext.INSTANCE,
+						schema.toRowDataType());
+
+		assertEquals(expectedDeser, actualDeser);
+
+		final AvroRowDataSerializationSchema expectedSer =
+				new AvroRowDataSerializationSchema(
+						rowType,
+						ConfluentRegistryAvroSerializationSchema.forGeneric(
+								subject,
+								AvroSchemaConverter.convertToSchema(rowType),
+								registryURL),
+						RowDataToAvroConverters.createRowConverter(rowType));
+
+		final DynamicTableSink actualSink = createTableSink(options);
+		assert actualSink instanceof TestDynamicTableFactory.DynamicTableSinkMock;
+		TestDynamicTableFactory.DynamicTableSinkMock sinkMock =
+				(TestDynamicTableFactory.DynamicTableSinkMock) actualSink;
+
+		SerializationSchema<RowData> actualSer = sinkMock.valueFormat
+				.createRuntimeEncoder(
+						null,
+						schema.toRowDataType());
+
+		assertEquals(expectedSer, actualSer);
+	}
+
+	@Test
+	public void testMissingSubjectForSink() {
+		thrown.expect(ValidationException.class);
+		thrown.expect(
+				containsCause(
+						new ValidationException("Option avro-sr.schema-registry.subject "
+								+ "is required for serialization")));
+
+		final Map<String, String> options =
+				getModifiedOptions(opts -> opts.remove("avro-sr.schema-registry.subject"));
+
+		createTableSink(options);
+	}
+
+	// ------------------------------------------------------------------------
+	//  Utilities
+	// ------------------------------------------------------------------------
+
+	/**
+	 * Returns the full options modified by the given consumer {@code optionModifier}.
+	 *
+	 * @param optionModifier Consumer to modify the options
+	 */
+	private Map<String, String> getModifiedOptions(Consumer<Map<String, String>> optionModifier) {
+		Map<String, String> options = getAllOptions();
+		optionModifier.accept(options);
+		return options;
+	}
+
+	private Map<String, String> getAllOptions() {

Review comment:
       `getAllOptions` -> `getDefaultOptions`

##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/AvroRowDataSerializationSchema.java
##########
@@ -64,61 +41,59 @@
 
 	private static final long serialVersionUID = 1L;
 
+	/** Nested schema to serialize the {@link GenericRecord} into bytes. **/
+	private final SerializationSchema<GenericRecord> nestedSchema;
+
 	/**
 	 * Logical type describing the input type.
 	 */
 	private final RowType rowType;
 
-	/**
-	 * Runtime instance that performs the actual work.
-	 */
-	private final SerializationRuntimeConverter runtimeConverter;
-
 	/**
 	 * Avro serialization schema.
 	 */
 	private transient Schema schema;
 
 	/**
-	 * Writer to serialize Avro record into a Avro bytes.
-	 */
-	private transient DatumWriter<IndexedRecord> datumWriter;
-
-	/**
-	 * Output stream to serialize records into byte array.
+	 * Runtime instance that performs the actual work.
 	 */
-	private transient ByteArrayOutputStream arrayOutputStream;
+	private final RowDataToAvroConverters.RowDataToAvroConverter runtimeConverter;
 
 	/**
-	 * Low-level class for serialization of Avro values.
+	 * Creates an Avro serialization schema with the given record row type.
 	 */
-	private transient Encoder encoder;
+	public AvroRowDataSerializationSchema(RowType rowType) {

Review comment:
       How about we remove this ctor? IMO the logic from this ctor should live only in the factory. I know that this class is in theory `PublicEvolving`, but in practice it is only usable from the Table API through the factory, so in my opinion it is safe to drop this ctor.
   
   The same applies to `SerializationSchema`.

##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/RegistryAvroRowDataSeDeSchemaTest.java
##########
@@ -0,0 +1,129 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro;
+
+import org.apache.flink.formats.avro.generated.Address;
+import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
+import org.apache.flink.formats.avro.utils.TestDataGenerator;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.RowType;
+
+import org.junit.Before;
+import org.junit.Test;
+
+import java.util.Random;
+
+import static org.apache.flink.formats.avro.utils.AvroTestUtils.writeRecord;
+import static org.hamcrest.core.Is.is;
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertThat;
+
+/**
+ * Tests for {@link AvroRowDataDeserializationSchema} and
+ * {@link AvroRowDataSerializationSchema} for schema registry avro.
+ */
+public class RegistryAvroRowDataSeDeSchemaTest {
+	private static final String ADDRESS_SCHEMA = "" +

Review comment:
       IMO we should add a simple test for serializing and deserializing using the schema registry. It does not need to be very in-depth, but it should check that everything is wired together correctly.

##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/AvroRowDataDeserializationSchema.java
##########
@@ -75,17 +44,10 @@
 @PublicEvolving
 public class AvroRowDataDeserializationSchema implements DeserializationSchema<RowData> {
 
-	private static final long serialVersionUID = 1L;
-
-	/**
-	 * Used for converting Date type.
-	 */
-	private static final int MILLIS_PER_DAY = 86400_000;
+	private static final long serialVersionUID = 9055890466043022732L;

Review comment:
       There is no point in using a large number here. Use `2L` instead.

##########
File path: flink-formats/flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/ConfluentRegistryAvroDeserializationSchema.java
##########
@@ -114,23 +113,4 @@ private ConfluentRegistryAvroDeserializationSchema(Class<T> recordClazz, @Nullab
 			new CachedSchemaCoderProvider(url, identityMapCapacity)
 		);
 	}
-
-	private static class CachedSchemaCoderProvider implements SchemaCoder.SchemaCoderProvider {

Review comment:
       ditto

##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/AvroToRowDataConverters.java
##########
@@ -0,0 +1,246 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro;
+
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.data.DecimalData;
+import org.apache.flink.table.data.GenericArrayData;
+import org.apache.flink.table.data.GenericMapData;
+import org.apache.flink.table.data.GenericRowData;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.data.StringData;
+import org.apache.flink.table.data.TimestampData;
+import org.apache.flink.table.types.logical.ArrayType;
+import org.apache.flink.table.types.logical.DecimalType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.types.logical.RowType;
+import org.apache.flink.table.types.logical.utils.LogicalTypeUtils;
+
+import org.apache.avro.generic.GenericFixed;
+import org.apache.avro.generic.GenericRecord;
+import org.apache.avro.generic.IndexedRecord;
+import org.joda.time.DateTime;
+import org.joda.time.DateTimeFieldType;
+import org.joda.time.LocalDate;
+
+import java.io.Serializable;
+import java.lang.reflect.Array;
+import java.nio.ByteBuffer;
+import java.sql.Timestamp;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import static org.apache.flink.formats.avro.typeutils.AvroSchemaConverter.extractValueTypeToAvroMap;
+import static org.joda.time.DateTimeConstants.MILLIS_PER_DAY;
+
+/** Tool class used to convert from Avro {@link GenericRecord} to {@link RowData}. **/
+public class AvroToRowDataConverters {

Review comment:
       `@Internal`

##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/RegistryAvroRowDataSeDeSchemaTest.java
##########
@@ -0,0 +1,129 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro;
+
+import org.apache.flink.formats.avro.generated.Address;
+import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
+import org.apache.flink.formats.avro.utils.TestDataGenerator;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.RowType;
+
+import org.junit.Before;
+import org.junit.Test;
+
+import java.util.Random;
+
+import static org.apache.flink.formats.avro.utils.AvroTestUtils.writeRecord;
+import static org.hamcrest.core.Is.is;
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertThat;
+
+/**
+ * Tests for {@link AvroRowDataDeserializationSchema} and
+ * {@link AvroRowDataSerializationSchema} for schema registry avro.
+ */
+public class RegistryAvroRowDataSeDeSchemaTest {

Review comment:
       Those tests have nothing to do with the schema registry.
   
   They test the same logic as in `AvroRowDataDeSerializationSchemaTest`

##########
File path: flink-formats/flink-avro-confluent-registry/src/test/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroFormatFactoryTest.java
##########
@@ -0,0 +1,183 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.registry.confluent;
+
+import org.apache.flink.api.common.serialization.DeserializationSchema;
+import org.apache.flink.api.common.serialization.SerializationSchema;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.formats.avro.AvroRowDataDeserializationSchema;
+import org.apache.flink.formats.avro.AvroRowDataSerializationSchema;
+import org.apache.flink.formats.avro.AvroToRowDataConverters;
+import org.apache.flink.formats.avro.RowDataToAvroConverters;
+import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableSchema;
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.CatalogTableImpl;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+import org.apache.flink.table.connector.sink.DynamicTableSink;
+import org.apache.flink.table.connector.source.DynamicTableSource;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.factories.FactoryUtil;
+import org.apache.flink.table.factories.TestDynamicTableFactory;
+import org.apache.flink.table.runtime.connector.source.ScanRuntimeProviderContext;
+import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
+import org.apache.flink.table.types.logical.RowType;
+
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.ExpectedException;
+
+import java.util.HashMap;
+import java.util.Map;
+import java.util.function.Consumer;
+
+import static org.apache.flink.core.testutils.FlinkMatchers.containsCause;
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Tests for the {@link RegistryAvroFormatFactory}.
+ */
+public class RegistryAvroFormatFactoryTest {
+	private TableSchema schema;
+	private RowType rowType;
+	private String subject;
+	private String registryURL;
+
+	@Rule
+	public ExpectedException thrown = ExpectedException.none();
+
+	@Before
+	public void before() {

Review comment:
       Why do we need that in the `@Before` block? Can't we just initialize these fields statically?
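   
   Just as a sketch of what I mean, the fixture could be plain static fields, e.g.:
   ```
   private static final TableSchema SCHEMA = TableSchema.builder()
   		.field("a", DataTypes.STRING())
   		.field("b", DataTypes.INT())
   		.field("c", DataTypes.BOOLEAN())
   		.build();
   private static final RowType ROW_TYPE = (RowType) SCHEMA.toRowDataType().getLogicalType();
   private static final String SUBJECT = "test-subject";
   private static final String REGISTRY_URL = "http://localhost:8081";
   ```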

##########
File path: flink-formats/flink-avro-confluent-registry/src/test/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroFormatFactoryTest.java
##########
@@ -0,0 +1,183 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.registry.confluent;
+
+import org.apache.flink.api.common.serialization.DeserializationSchema;
+import org.apache.flink.api.common.serialization.SerializationSchema;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.formats.avro.AvroRowDataDeserializationSchema;
+import org.apache.flink.formats.avro.AvroRowDataSerializationSchema;
+import org.apache.flink.formats.avro.AvroToRowDataConverters;
+import org.apache.flink.formats.avro.RowDataToAvroConverters;
+import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableSchema;
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.CatalogTableImpl;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+import org.apache.flink.table.connector.sink.DynamicTableSink;
+import org.apache.flink.table.connector.source.DynamicTableSource;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.factories.FactoryUtil;
+import org.apache.flink.table.factories.TestDynamicTableFactory;
+import org.apache.flink.table.runtime.connector.source.ScanRuntimeProviderContext;
+import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
+import org.apache.flink.table.types.logical.RowType;
+
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.ExpectedException;
+
+import java.util.HashMap;
+import java.util.Map;
+import java.util.function.Consumer;
+
+import static org.apache.flink.core.testutils.FlinkMatchers.containsCause;
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Tests for the {@link RegistryAvroFormatFactory}.
+ */
+public class RegistryAvroFormatFactoryTest {
+	private TableSchema schema;
+	private RowType rowType;
+	private String subject;
+	private String registryURL;
+
+	@Rule
+	public ExpectedException thrown = ExpectedException.none();
+
+	@Before
+	public void before() {
+		this.schema = TableSchema.builder()
+				.field("a", DataTypes.STRING())
+				.field("b", DataTypes.INT())
+				.field("c", DataTypes.BOOLEAN())
+				.build();
+		this.rowType = (RowType) schema.toRowDataType().getLogicalType();
+		this.subject = "test-subject";
+		this.registryURL = "http://localhost:8081";
+	}
+
+	@Test
+	public void testSeDeSchema() {
+		final AvroRowDataDeserializationSchema expectedDeser =
+				new AvroRowDataDeserializationSchema(
+						ConfluentRegistryAvroDeserializationSchema.forGeneric(
+								AvroSchemaConverter.convertToSchema(rowType),
+								registryURL),
+						AvroToRowDataConverters.createRowConverter(rowType),
+						InternalTypeInfo.of(rowType));
+
+		final Map<String, String> options = getAllOptions();
+
+		final DynamicTableSource actualSource = createTableSource(options);
+		assert actualSource instanceof TestDynamicTableFactory.DynamicTableSourceMock;

Review comment:
       Please don't use `assert`. I can't think of a reason to use an `assert` in a test: Java assertions are disabled at runtime by default (they only run when the JVM is started with `-ea`), so such checks can be silently skipped. If you want to check the type of `actualSource`, use e.g.
   `assertThat(actualSource, instanceOf(TestDynamicTableFactory.DynamicTableSourceMock.class));`

##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -169,6 +171,121 @@ private AvroSchemaConverter() {
 		throw new IllegalArgumentException("Unsupported Avro type '" + schema.getType() + "'.");
 	}
 
+	/**
+	 * Converts an Avro schema string into a nested row structure with deterministic field order and data
+	 * types that are compatible with Flink's Table & SQL API.
+	 *
+	 * @param avroSchemaString Avro schema definition string
+	 *
+	 * @return data type matching the schema
+	 */
+	public static DataType convertToDataType(String avroSchemaString) {

Review comment:
       Could we add tests for this method?
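   
   A minimal sketch of such a test (hypothetical test name and schema string; the exact expected types and nullability would need to match what `convertToDataType` actually produces) could be:
   ```
   @Test
   public void testConvertToDataType() {
   	// A record with a nullable string field and a non-null int field.
   	String avroSchemaString = "{\"type\":\"record\",\"name\":\"record\",\"fields\":["
   			+ "{\"name\":\"a\",\"type\":[\"string\",\"null\"]},"
   			+ "{\"name\":\"b\",\"type\":\"int\"}]}";
   	DataType actual = AvroSchemaConverter.convertToDataType(avroSchemaString);
   	RowType rowType = (RowType) actual.getLogicalType();
   	// Field order and names should be deterministic; asserting against a fully
   	// specified expected DataType (including nullability) would be even stricter.
   	assertEquals(java.util.Arrays.asList("a", "b"), rowType.getFieldNames());
   }
   ```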




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8984f0fbb7914ce69763e0a2b7afb869621bdd0d Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-661830869


   > After a quick glimpse: could we unify the `AvroRowDataDeserializationSchema` with `ConfluentRegistryAvroRowDataDeserializationSchema` and `RegistryAvroRowDataSerializationSchema`?
   > I really believe we need just a single AvroRowDeserializationSchema for Avro in the Table API.
   > 
   > I am quite sure something like this would work:
   > 
   > ```
   > public class RowDataDeserializationSchema implements DeserializationSchema<RowData> {
   >     private final DeserializationSchema<GenericRecord> nestedSchema;
   >     private final DeserializationRuntimeConverter runtimeConverter;
   >     private final TypeInformation<RowData> resultType;
   > 
   >     public RowDataDeserializationSchema(
   >             DeserializationSchema<GenericRecord> nestedSchema,
   >             DeserializationRuntimeConverter runtimeConverter,
   >             TypeInformation<RowData> resultType) {
   >         this.nestedSchema = nestedSchema;
   >         this.runtimeConverter = runtimeConverter;
   >         this.resultType = resultType;
   >     }
   > 
   >     @Override
   >     public void open(InitializationContext context) throws Exception {
   >         nestedSchema.open(context);
   >     }
   > 
   >     @Override
   >     public RowData deserialize(byte[] message) throws IOException {
   >         try {
   >             GenericRecord deserialized = nestedSchema.deserialize(message);
   >             return (RowData) runtimeConverter.convert(deserialized);
   >         } catch (Exception e) {
   >             throw new IOException("Failed to deserialize Avro record.", e);
   >         }
   >     }
   > 
   >     @Override
   >     public boolean isEndOfStream(RowData nextElement) {
   >         return false;
   >     }
   > 
   >     @Override
   >     public TypeInformation<RowData> getProducedType() {
   >         return resultType;
   >     }
   > }
   > ```
   > 
   > and then you would use it like this:
   > 
   > in `AvroFormatFactory`:
   > 
   > ```
   > new RowDataDeserializationSchema(
   > 	AvroDeserializationSchema.forGeneric(AvroSchemaConverter.convertToSchema(rowType)),
   >         createRowConverter(rowType), // we would need to move this method to some utils class or to a common abstract class for factories
   > 	rowDataTypeInfo
   > );
   > ```
   > 
   > in `RegistryAvroFormatFactory`:
   > 
   > ```
   > new RowDataDeserializationSchema(
   > 	ConfluentRegistryAvroDeserializationSchema.forGeneric(
   > 		AvroSchemaConverter.convertToSchema(rowType),
   > 		schemaRegistryURL
   > 	),
   >         createRowConverter(rowType),
   > 	rowDataTypeInfo
   > );
   > ```
   
   Thanks for the nice review, I have addressed your comments.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bac18583fd0ba4855eebd76409198e1fb3fc3314 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dawidwys commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r457214825



##########
File path: flink-formats/flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/CachedSchemaCoderProvider.java
##########
@@ -0,0 +1,74 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.registry.confluent;
+
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
+
+import javax.annotation.Nullable;
+
+import java.util.Objects;
+
+/** A {@link SchemaCoder.SchemaCoderProvider} that is backed by a cached schema registry
+ * client. */
+public class CachedSchemaCoderProvider implements SchemaCoder.SchemaCoderProvider {

Review comment:
       default scope? + `@Internal`
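
   A minimal sketch of what this suggests (the name `ExampleInternalProvider` is illustrative, not the actual class, and the body is omitted): keep the provider at package-private (default) visibility and annotate it with Flink's `@Internal` marker so it stays out of the public API surface.

   ```java
   import org.apache.flink.annotation.Internal;

   /** Package-private (default) visibility plus @Internal keeps this class out of the public API. */
   @Internal
   final class ExampleInternalProvider {
       // implementation details are intentionally omitted in this sketch
   }
   ```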




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637) 
   * 0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     }, {
       "hash" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4863",
       "triggerID" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4875",
       "triggerID" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "triggerType" : "PUSH"
     }, {
       "hash" : "52518eecfce65f5adceda689fa720f15c85413b6",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "52518eecfce65f5adceda689fa720f15c85413b6",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 6bd8c02de778a8ff2f34a19d5beee414beac3f69 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4875) 
   * 52518eecfce65f5adceda689fa720f15c85413b6 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bac18583fd0ba4855eebd76409198e1fb3fc3314 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578) 
   * 87d74c99c293fba3090208d89f687be7dbe3f3ab UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * edb952c0f8ae4394b7f5238f4fea39878106a775 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] KurtYoung commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
KurtYoung commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-662787040


   Would `flink-avro-confluent` be a better module name than `flink-avro-confluent-registry`? IMO `registry` has nothing to do with the format itself.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637) 
   * 0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 22ed53e6e047b379e0ee568298600afd9283b2b8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-715024308


   Thanks, I think this is a bug; I have logged an issue for it. See https://issues.apache.org/jira/browse/FLINK-19779


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     }, {
       "hash" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4863",
       "triggerID" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4875",
       "triggerID" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "triggerType" : "PUSH"
     }, {
       "hash" : "52518eecfce65f5adceda689fa720f15c85413b6",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4902",
       "triggerID" : "52518eecfce65f5adceda689fa720f15c85413b6",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 52518eecfce65f5adceda689fa720f15c85413b6 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4902) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 48d837f15c74d134f2ba8b8e8f7ea27e6b62299f Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] homepy commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
homepy commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r503783829



##########
File path: flink-formats/flink-avro-confluent-registry/src/main/resources/META-INF/services/org.apache.flink.table.factories.Factory
##########
@@ -0,0 +1,16 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+org.apache.flink.formats.avro.registry.confluent.RegistryAvroFormatFactory

Review comment:
       This file does not exist in the release jar (https://mvnrepository.com/artifact/org.apache.flink/flink-avro-confluent-registry/1.11.2); the jar contains some other entries instead. 
   Without this file, we cannot use the format in sql-client.sh...
   Maybe there is a mistake in the maven-shade-plugin config in pom.xml?
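
   For context, Flink discovers table factories through the JDK `ServiceLoader`, which reads `META-INF/services/org.apache.flink.table.factories.Factory` from the class path; if the shaded jar loses that entry, the format cannot be found. A minimal diagnostic sketch (the class name `FactoryDiscoveryCheck` is illustrative, not part of Flink) that lists the factory identifiers visible on the class path:

   ```java
   import java.util.ServiceLoader;

   import org.apache.flink.table.factories.Factory;

   /** Lists every table factory identifier discoverable via the SPI file on the class path. */
   public class FactoryDiscoveryCheck {
       public static void main(String[] args) {
           // If the flink-avro-confluent-registry jar is on the class path and its SPI entry
           // survived shading, the registry Avro format's identifier will appear in this output.
           for (Factory factory : ServiceLoader.load(Factory.class)) {
               System.out.println(factory.factoryIdentifier());
           }
       }
   }
   ```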




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 22ed53e6e047b379e0ee568298600afd9283b2b8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-714889392


   > @danny0405 @dawidwys
   > Any reason why all the fields read and written by this format have the prefix 'record_'? (I'm using Flink SQL for this client)
   > I found the responsible code, probably here, but I still have a problem with this solution:
   > https://github.com/apache/flink/blob/de87a2debde8546e6741390a81f43c032521c3c0/flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java#L365
   
   It's because the current strategy is to infer the Avro schema by converting it from the `CREATE TABLE` DDL, and there is no way to get the record name there, so we use the constant `record` as a prefix. The records written out all have explicit field names, but the types should be compatible.
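
   A small sketch of that behavior (a minimal program, not part of the PR itself): converting a DDL-derived row type with `AvroSchemaConverter` prints the inferred Avro schema, whose record names are generated from the constant `record` because no table or record name reaches the converter.

   ```java
   import org.apache.avro.Schema;

   import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
   import org.apache.flink.table.api.DataTypes;

   /** Prints the Avro schema that the format infers from a DDL-like row type. */
   public class InferredSchemaExample {
       public static void main(String[] args) {
           // The converter only sees the row type derived from the CREATE TABLE columns,
           // not the table name, so the generated record names fall back to "record".
           Schema schema = AvroSchemaConverter.convertToSchema(
                   DataTypes.ROW(
                           DataTypes.FIELD("id", DataTypes.BIGINT()),
                           DataTypes.FIELD("name", DataTypes.STRING()))
                       .getLogicalType());
           System.out.println(schema.toString(true));
       }
   }
   ```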


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r458664909



##########
File path: flink-formats/flink-avro-confluent-registry/src/test/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroFormatFactoryTest.java
##########
@@ -0,0 +1,183 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.registry.confluent;
+
+import org.apache.flink.api.common.serialization.DeserializationSchema;
+import org.apache.flink.api.common.serialization.SerializationSchema;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.formats.avro.AvroRowDataDeserializationSchema;
+import org.apache.flink.formats.avro.AvroRowDataSerializationSchema;
+import org.apache.flink.formats.avro.AvroToRowDataConverters;
+import org.apache.flink.formats.avro.RowDataToAvroConverters;
+import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableSchema;
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.CatalogTableImpl;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+import org.apache.flink.table.connector.sink.DynamicTableSink;
+import org.apache.flink.table.connector.source.DynamicTableSource;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.factories.FactoryUtil;
+import org.apache.flink.table.factories.TestDynamicTableFactory;
+import org.apache.flink.table.runtime.connector.source.ScanRuntimeProviderContext;
+import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
+import org.apache.flink.table.types.logical.RowType;
+
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.ExpectedException;
+
+import java.util.HashMap;
+import java.util.Map;
+import java.util.function.Consumer;
+
+import static org.apache.flink.core.testutils.FlinkMatchers.containsCause;
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Tests for the {@link RegistryAvroFormatFactory}.
+ */
+public class RegistryAvroFormatFactoryTest {
+	private TableSchema schema;
+	private RowType rowType;
+	private String subject;
+	private String registryURL;
+
+	@Rule
+	public ExpectedException thrown = ExpectedException.none();
+
+	@Before
+	public void before() {

Review comment:
       We can; making them static also works.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] maver1ck edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
maver1ck edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-714693831


   @danny0405 @dawidwys 
   Any reason why all the fields read and written by this format have the prefix 'record_'? (I'm using Flink SQL for this)


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657) 
   * edb952c0f8ae4394b7f5238f4fea39878106a775 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8984f0fbb7914ce69763e0a2b7afb869621bdd0d Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687) 
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6839c54eedcdca926b8304782fabcb0dc529c5a6",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4847",
       "triggerID" : "6839c54eedcdca926b8304782fabcb0dc529c5a6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 6839c54eedcdca926b8304782fabcb0dc529c5a6 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4847) 
   * 9d8870894b4d9d434c45b58339985aed3b76a8be Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r458672440



##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/RegistryAvroRowDataSeDeSchemaTest.java
##########
@@ -0,0 +1,129 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro;
+
+import org.apache.flink.formats.avro.generated.Address;
+import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
+import org.apache.flink.formats.avro.utils.TestDataGenerator;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.RowType;
+
+import org.junit.Before;
+import org.junit.Test;
+
+import java.util.Random;
+
+import static org.apache.flink.formats.avro.utils.AvroTestUtils.writeRecord;
+import static org.hamcrest.core.Is.is;
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertThat;
+
+/**
+ * Tests for {@link AvroRowDataDeserializationSchema} and
+ * {@link AvroRowDataSerializationSchema} for schema registry avro.
+ */
+public class RegistryAvroRowDataSeDeSchemaTest {

Review comment:
       Yes, we can remove it.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r458724014



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/AvroRowDataSerializationSchema.java
##########
@@ -64,61 +41,59 @@
 
 	private static final long serialVersionUID = 1L;
 
+	/** Nested schema to serialize the {@link GenericRecord} into bytes. **/
+	private final SerializationSchema<GenericRecord> nestedSchema;
+
 	/**
 	 * Logical type describing the input type.
 	 */
 	private final RowType rowType;
 
-	/**
-	 * Runtime instance that performs the actual work.
-	 */
-	private final SerializationRuntimeConverter runtimeConverter;
-
 	/**
 	 * Avro serialization schema.
 	 */
 	private transient Schema schema;
 
 	/**
-	 * Writer to serialize Avro record into a Avro bytes.
-	 */
-	private transient DatumWriter<IndexedRecord> datumWriter;
-
-	/**
-	 * Output stream to serialize records into byte array.
+	 * Runtime instance that performs the actual work.
 	 */
-	private transient ByteArrayOutputStream arrayOutputStream;
+	private final RowDataToAvroConverters.RowDataToAvroConverter runtimeConverter;
 
 	/**
-	 * Low-level class for serialization of Avro values.
+	 * Creates an Avro serialization schema with the given record row type.
 	 */
-	private transient Encoder encoder;
+	public AvroRowDataSerializationSchema(RowType rowType) {

Review comment:
       Why not? The Avro row data format defaults to serializing/deserializing Avro without a schema registry; if other formats want to customize something, they can use the appropriate constructor. Writing the same code everywhere just makes no sense.
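
   For illustration, a hedged sketch of the two construction paths described above (the one-argument constructor is taken from the diff; the registry-aware path is only implied there, so its exact signature is deliberately not spelled out):

   // Hedged sketch; only the one-argument constructor is confirmed by the diff above.
   import org.apache.flink.formats.avro.AvroRowDataSerializationSchema;
   import org.apache.flink.table.api.DataTypes;
   import org.apache.flink.table.types.logical.RowType;

   public class AvroRowDataSerializationSketch {

   	public static void main(String[] args) {
   		RowType rowType = (RowType) DataTypes.ROW(
   				DataTypes.FIELD("num", DataTypes.INT()),
   				DataTypes.FIELD("street", DataTypes.STRING()))
   			.getLogicalType();

   		// Default path: plain Avro (de)serialization, no schema registry involved.
   		AvroRowDataSerializationSchema plainAvro = new AvroRowDataSerializationSchema(rowType);

   		// Registry-aware formats would instead pass a nested SerializationSchema<GenericRecord>
   		// through the other constructor implied by the diff, wrapping a registry-aware writer.
   	}
   }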




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657) 
   * edb952c0f8ae4394b7f5238f4fea39878106a775 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r458724014



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/AvroRowDataSerializationSchema.java
##########
@@ -64,61 +41,59 @@
 
 	private static final long serialVersionUID = 1L;
 
+	/** Nested schema to serialize the {@link GenericRecord} into bytes. **/
+	private final SerializationSchema<GenericRecord> nestedSchema;
+
 	/**
 	 * Logical type describing the input type.
 	 */
 	private final RowType rowType;
 
-	/**
-	 * Runtime instance that performs the actual work.
-	 */
-	private final SerializationRuntimeConverter runtimeConverter;
-
 	/**
 	 * Avro serialization schema.
 	 */
 	private transient Schema schema;
 
 	/**
-	 * Writer to serialize Avro record into a Avro bytes.
-	 */
-	private transient DatumWriter<IndexedRecord> datumWriter;
-
-	/**
-	 * Output stream to serialize records into byte array.
+	 * Runtime instance that performs the actual work.
 	 */
-	private transient ByteArrayOutputStream arrayOutputStream;
+	private final RowDataToAvroConverters.RowDataToAvroConverter runtimeConverter;
 
 	/**
-	 * Low-level class for serialization of Avro values.
+	 * Creates an Avro serialization schema with the given record row type.
 	 */
-	private transient Encoder encoder;
+	public AvroRowDataSerializationSchema(RowType rowType) {

Review comment:
       Why not? The Avro row data format defaults to serializing/deserializing Avro without a schema registry; if other formats want to customize something, they can use the appropriate constructor. Writing the same code everywhere just makes no sense.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-663488058


   Schema registry is an established term, and people usually say "schema registry url" [1]; the same goes for "schema registry subject".
   [1] https://docs.confluent.io/current/schema-registry/index.html#high-availability-for-single-primary-setup
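
   For illustration, a hedged sketch of how those two options could be declared as Flink ConfigOptions; the key names follow the wording argued for above and are not necessarily the ones that ended up in the merged format:

   // Hedged sketch; option keys and descriptions are illustrative, not the merged format's.
   import org.apache.flink.configuration.ConfigOption;
   import org.apache.flink.configuration.ConfigOptions;

   public class SchemaRegistryOptionsSketch {

   	public static final ConfigOption<String> SCHEMA_REGISTRY_URL =
   			ConfigOptions.key("schema-registry.url")
   					.stringType()
   					.noDefaultValue()
   					.withDescription("URL of the schema registry service to connect to.");

   	public static final ConfigOption<String> SCHEMA_REGISTRY_SUBJECT =
   			ConfigOptions.key("schema-registry.subject")
   					.stringType()
   					.noDefaultValue()
   					.withDescription("Subject under which the sink registers the schema it writes with.");
   }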


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     }, {
       "hash" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4863",
       "triggerID" : "810321d988a8284eb54c2963f22a049dc06ac8aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4875",
       "triggerID" : "6bd8c02de778a8ff2f34a19d5beee414beac3f69",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 810321d988a8284eb54c2963f22a049dc06ac8aa Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4863) 
   * 6bd8c02de778a8ff2f34a19d5beee414beac3f69 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4875) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4762",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769",
       "triggerID" : "9b557c718fe731e8d5c58e7c5d9c3452a245ee5a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "71218ee49095663a641e56889831536a2a2e69ea",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818",
       "triggerID" : "71218ee49095663a641e56889831536a2a2e69ea",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6839c54eedcdca926b8304782fabcb0dc529c5a6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4847",
       "triggerID" : "6839c54eedcdca926b8304782fabcb0dc529c5a6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851",
       "triggerID" : "9d8870894b4d9d434c45b58339985aed3b76a8be",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc53f692ab7edf110e6c7c39202e50ed1ec0c05d",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "bc53f692ab7edf110e6c7c39202e50ed1ec0c05d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 9b557c718fe731e8d5c58e7c5d9c3452a245ee5a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4769) 
   * 71218ee49095663a641e56889831536a2a2e69ea Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4818) 
   * 9d8870894b4d9d434c45b58339985aed3b76a8be Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4851) 
   * bc53f692ab7edf110e6c7c39202e50ed1ec0c05d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bac18583fd0ba4855eebd76409198e1fb3fc3314 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578) 
   * 87d74c99c293fba3090208d89f687be7dbe3f3ab Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-715117170


   I have filed a fix: https://github.com/apache/flink/pull/13763/files. Can you help check it if possible, @maver1ck? :)


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r458729825



##########
File path: flink-formats/flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/CachedSchemaCoderProvider.java
##########
@@ -0,0 +1,76 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.registry.confluent;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.formats.avro.SchemaCoder;
+
+import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
+
+import javax.annotation.Nullable;
+
+import java.util.Objects;
+
+/** A {@link SchemaCoder.SchemaCoderProvider} that uses a cached schema registry
+ * client underlying. **/
+@Internal
+class CachedSchemaCoderProvider implements SchemaCoder.SchemaCoderProvider {

Review comment:
       We only need one class for both the serialization and the deserialization schema.
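
   A hedged sketch of such a single provider shared by both the serialization and deserialization paths, assuming the existing ConfluentSchemaRegistryCoder constructors (the class name and the identity-map capacity of 1000 are illustrative choices, not the PR's actual CachedSchemaCoderProvider):

   // Hedged sketch only; not the PR's CachedSchemaCoderProvider.
   import org.apache.flink.formats.avro.SchemaCoder;
   import org.apache.flink.formats.avro.registry.confluent.ConfluentSchemaRegistryCoder;

   import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;

   import javax.annotation.Nullable;

   class SharedSchemaCoderProviderSketch implements SchemaCoder.SchemaCoderProvider {

   	private static final long serialVersionUID = 1L;

   	@Nullable
   	private final String subject; // only needed on the write path
   	private final String registryUrl;

   	SharedSchemaCoderProviderSketch(@Nullable String subject, String registryUrl) {
   		this.subject = subject;
   		this.registryUrl = registryUrl;
   	}

   	@Override
   	public SchemaCoder get() {
   		// Create the registry client lazily so the provider itself stays serializable.
   		return new ConfluentSchemaRegistryCoder(subject, new CachedSchemaRegistryClient(registryUrl, 1000));
   	}
   }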




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-659928275


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4578",
       "triggerID" : "bac18583fd0ba4855eebd76409198e1fb3fc3314",
       "triggerType" : "PUSH"
     }, {
       "hash" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4622",
       "triggerID" : "87d74c99c293fba3090208d89f687be7dbe3f3ab",
       "triggerType" : "PUSH"
     }, {
       "hash" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4627",
       "triggerID" : "48d837f15c74d134f2ba8b8e8f7ea27e6b62299f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4637",
       "triggerID" : "d6ad9058acaa94a47ffcc342ca72cd1cbbd17ee0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4657",
       "triggerID" : "0829a1d9fbbdffccf0399ff0a0c4dc9a959c1b24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4667",
       "triggerID" : "edb952c0f8ae4394b7f5238f4fea39878106a775",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687",
       "triggerID" : "8984f0fbb7914ce69763e0a2b7afb869621bdd0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703",
       "triggerID" : "d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713",
       "triggerID" : "f0da3cee91d22ec20cbba1b6c5be45da1440cf05",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "f006afeec4c8ee25dfe12b944e2cf4260239ca1e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730",
       "triggerID" : "9104e12b0394cd6d578d2380ca4554b75e6e00f9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "22ed53e6e047b379e0ee568298600afd9283b2b8",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8984f0fbb7914ce69763e0a2b7afb869621bdd0d Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4687) 
   * d1e4ba7690be134e21d193fbc1cb01aa51aaeb9b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4703) 
   * f0da3cee91d22ec20cbba1b6c5be45da1440cf05 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4713) 
   * f006afeec4c8ee25dfe12b944e2cf4260239ca1e UNKNOWN
   * 9104e12b0394cd6d578d2380ca4554b75e6e00f9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=4730) 
   * 22ed53e6e047b379e0ee568298600afd9283b2b8 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] homepy commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
homepy commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r503783829



##########
File path: flink-formats/flink-avro-confluent-registry/src/main/resources/META-INF/services/org.apache.flink.table.factories.Factory
##########
@@ -0,0 +1,16 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+org.apache.flink.formats.avro.registry.confluent.RegistryAvroFormatFactory

Review comment:
       This file does not exist in the released jar (https://mvnrepository.com/artifact/org.apache.flink/flink-avro-confluent-registry/1.11.2); it contains some other entries instead.
   Maybe there is a mistake in the maven-shade-plugin config in pom.xml?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-716324174


   @maver1ck, you are right: we ignore the nullability of `TIMESTAMP_WITHOUT_TIME_ZONE`, `DATE`, `TIME_WITHOUT_TIME_ZONE`, and `Decimal`; I will fix them altogether in this PR.
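
   For context, a hedged sketch of the usual way nullability is honored when mapping a logical type to an Avro schema (the helper name and class are made up for illustration and are not the actual AvroSchemaConverter code):

   // Hedged sketch; illustrative helper, not the actual AvroSchemaConverter implementation.
   import org.apache.avro.Schema;
   import org.apache.flink.table.types.logical.LogicalType;

   import java.util.Arrays;

   final class NullableAvroSchemaSketch {

   	private NullableAvroSchemaSketch() {
   	}

   	static Schema wrapIfNullable(LogicalType logicalType, Schema converted) {
   		if (logicalType.isNullable()) {
   			// ["null", <type>] is the conventional Avro encoding of an optional value.
   			return Schema.createUnion(Arrays.asList(Schema.create(Schema.Type.NULL), converted));
   		}
   		return converted;
   	}
   }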


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dawidwys commented on a change in pull request #12919: [FLINK-16048][avro] Support read/write confluent schema registry avro…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #12919:
URL: https://github.com/apache/flink/pull/12919#discussion_r460021794



##########
File path: flink-formats/flink-avro-confluent-registry/src/test/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroRowDataSeDeSchemaTest.java
##########
@@ -0,0 +1,199 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.formats.avro.registry.confluent;
+
+import org.apache.flink.formats.avro.AvroRowDataDeserializationSchema;
+import org.apache.flink.formats.avro.AvroRowDataSerializationSchema;
+import org.apache.flink.formats.avro.AvroToRowDataConverters;
+import org.apache.flink.formats.avro.RegistryAvroDeserializationSchema;
+import org.apache.flink.formats.avro.RegistryAvroSerializationSchema;
+import org.apache.flink.formats.avro.RowDataToAvroConverters;
+import org.apache.flink.formats.avro.generated.Address;
+import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
+import org.apache.flink.formats.avro.utils.TestDataGenerator;
+import org.apache.flink.table.data.GenericRowData;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.data.binary.BinaryStringData;
+import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.RowType;
+
+import io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient;
+import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
+import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException;
+import org.apache.avro.Schema;
+import org.apache.avro.generic.GenericRecord;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.ExpectedException;
+
+import java.io.IOException;
+import java.util.Random;
+
+import static org.apache.flink.core.testutils.FlinkMatchers.containsCause;
+import static org.apache.flink.formats.avro.utils.AvroTestUtils.writeRecord;
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.core.Is.is;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertThat;
+
+/**
+ * Tests for {@link AvroRowDataDeserializationSchema} and
+ * {@link AvroRowDataSerializationSchema} for schema registry avro.
+ */
+public class RegistryAvroRowDataSeDeSchemaTest {
+	private static final Schema ADDRESS_SCHEMA = Address.getClassSchema();
+
+	private static final Schema ADDRESS_SCHEMA_COMPATIBLE = new Schema.Parser().parse(
+			"" +
+					"{\"namespace\": \"org.apache.flink.formats.avro.generated\",\n" +
+					" \"type\": \"record\",\n" +
+					" \"name\": \"Address\",\n" +
+					" \"fields\": [\n" +
+					"     {\"name\": \"num\", \"type\": \"int\"},\n" +
+					"     {\"name\": \"street\", \"type\": \"string\"}\n" +
+					"  ]\n" +
+					"}");
+
+	private static final String SUBJECT = "address-value";
+
+	private static SchemaRegistryClient client;
+
+	private Address address;
+
+	@Rule
+	public ExpectedException expectedEx = ExpectedException.none();
+
+	@BeforeClass
+	public static void beforeClass() {
+		client = new MockSchemaRegistryClient();
+	}
+
+	@Before
+	public void before() {
+		this.address = TestDataGenerator.generateRandomAddress(new Random());
+	}
+
+	@After
+	public void after() throws IOException, RestClientException {
+		client.deleteSubject(SUBJECT);
+	}
+
+	@Test
+	public void testRowDataWriteReadWithFullSchema() throws Exception {
+		testRowDataWriteReadWithSchema(ADDRESS_SCHEMA);
+	}
+
+	@Test
+	public void testRowDataWriteReadWithCompatibleSchema() throws Exception {
+		testRowDataWriteReadWithSchema(ADDRESS_SCHEMA_COMPATIBLE);
+		// Validates new schema has been registered.
+		assertThat(client.getAllVersions("address-value").size(), is(1));

Review comment:
       nit: use `SUBJECT`

##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/RegistryAvroDeserializationSchema.java
##########
@@ -52,13 +53,22 @@
 	 * @param schemaCoderProvider schema provider that allows instantiation of {@link SchemaCoder} that will be used for
 	 *                            schema reading
 	 */
-	protected RegistryAvroDeserializationSchema(Class<T> recordClazz, @Nullable Schema reader,
+	public RegistryAvroDeserializationSchema(Class<T> recordClazz, @Nullable Schema reader,
 			SchemaCoder.SchemaCoderProvider schemaCoderProvider) {
 		super(recordClazz, reader);
 		this.schemaCoderProvider = schemaCoderProvider;
 		this.schemaCoder = schemaCoderProvider.get();
 	}
 
+	public static RegistryAvroDeserializationSchema<GenericRecord> forGeneric(

Review comment:
       Remove this method? If you prefer to keep it, add the missing Javadoc.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org