Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2020/10/23 07:56:41 UTC

[GitHub] [flink] danny0405 opened a new pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

danny0405 opened a new pull request #13763:
URL: https://github.com/apache/flink/pull/13763


   …t Avro format deserialization
   
   ## What is the purpose of the change
   
   Add a prefix to the field name only when the record and the field have the same name.
   
   
   ## Brief change log
   
     - Modify `AvroSchemaConverter.convertToSchema` to append the field-name prefix only when the record and the field have the same name
     - Modify the existing test
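
   The renaming rule above can be sketched as a standalone helper. This is a hypothetical simplification: the real `AvroSchemaConverter.convertToSchema` builds `org.apache.avro.Schema` objects rather than strings, and the helper name is invented for illustration.

   ```java
   // Hypothetical sketch of the nested-record naming rule in this PR.
   // Avro forbids two named schemas with the same full name, so a prefix
   // is needed only when a nested row field collides with the name of its
   // enclosing record; all other fields keep their plain names.
   public class NestedRecordNaming {
       static String nestedRecordName(String rowName, String fieldName) {
           return rowName.equals(fieldName)
                   ? rowName + "_" + fieldName // disambiguate the collision
                   : fieldName;                // keep the plain field name
       }
   }
   ```

   Under this sketch, a field `row1` inside the record `record` keeps its plain name, while a field itself named `record` would be disambiguated.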
   
   
   ## Verifying this change
   
   Added unit tests.
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): no
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: no
     - The serializers: no
     - The runtime per-record code paths (performance sensitive): no
     - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn/Mesos, ZooKeeper: no
     - The S3 file system connector: no
   
   ## Documentation
   
     - Does this pull request introduce a new feature? no
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r512618142



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether this call converts the root record;
+	 *                    if so, the nullability of the logical type is ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       It is meaningless to have a nullable top-level row type, and in the `CREATE TABLE` DDL there is no way to specify the nullability of the table row; it should always be NOT NULL (even if all of its fields are null).
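
   A minimal toy model (my own sketch, not the real Flink code) of how the `top` flag in the diff above is meant to behave: a nullable logical type is normally encoded as an Avro union `["null", T]`, but the root record ignores nullability.

   ```java
   import java.util.List;

   // Toy model of the "top" flag: nullable types become the union
   // ["null", T] except at the root, because a SQL table row is always
   // NOT NULL regardless of what the logical type's flag says.
   public class NullableUnionSketch {
       static List<String> toAvroType(String type, boolean nullable, boolean top) {
           return (nullable && !top)
                   ? List.of("null", type) // union ["null", T]
                   : List.of(type);        // plain T
       }
   }
   ```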




----------------------------------------------------------------



[GitHub] [flink] danny0405 commented on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-718589957


   > I think there is still a test failling: `org.apache.flink.table.client.gateway.local.LocalExecutorITCase#testStreamQueryExecutionSink`
   
   Yes, sorry about that; I forgot to commit some files. I have re-committed and re-run the tests.


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * 6ef138838f1a4a0b45a42eff39b75847c9b39767 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * b92a32d37a8eaa446f24f44e467e588cdabcd9f5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------



[GitHub] [flink] wuchong commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r512632436



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether this call converts the root record;
+	 *                    if so, the nullability of the logical type is ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       I have the same feeling as @dawidwys.




----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * 13712c13da124c38a430d4cc50ad28aec4318764 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8556) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------



[GitHub] [flink] dawidwys commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r512628613



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether this call converts the root record;
+	 *                    if so, the nullability of the logical type is ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       I agree that the top-level row should not be nullable. It is a property of a SQL table, though, not of the format. If I am not mistaken, you cannot declare a ROW nullable in `CREATE TABLE` because it is always `NOT NULL`. Therefore the additional parameter in `convertToSchema` is unnecessary.
   
   What I am saying is that the format should not be bothered with this. It is purely a matter for the planner.




----------------------------------------------------------------



[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r512618789



##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverterTest.java
##########
@@ -104,48 +104,115 @@ public void testRowTypeAvroSchemaConversion() {
 				DataTypes.FIELD("row3", DataTypes.ROW(DataTypes.FIELD("c", DataTypes.STRING())))))
 			.build().toRowDataType().getLogicalType();
 		Schema schema = AvroSchemaConverter.convertToSchema(rowType);
-		assertEquals("{\n" +
+		assertEquals("[ {\n" +

Review comment:
       Thanks, I will add a `schema` -> `DataType` -> `schema` round-trip test.




----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * ba752274c9926115f65f5ecbef55b71b0b71cfa2 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8440) 
   * f004220668e20dcd9860026b69566868d473db33 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8455) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * 170355bcb112d902e78ee4fc1644b3e6aee5eb29 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8584) 
   * e0a53361318b9ee5ceae9a1d6bc50f77a2d28c8a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8591) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * a28669bca9c36dac74f89da177825d73bea0ece0 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408) 
   * ba752274c9926115f65f5ecbef55b71b0b71cfa2 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8440) 
   * f004220668e20dcd9860026b69566868d473db33 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8455) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------



[GitHub] [flink] flinkbot commented on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * b92a32d37a8eaa446f24f44e467e588cdabcd9f5 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * b92a32d37a8eaa446f24f44e467e588cdabcd9f5 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------



[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r513370899



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether this call converts the root record;
+	 *                    if so, the nullability of the logical type is ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       I had an offline discussion with @wuchong and @dawidwys, and after some research we agreed that a non-nullable top-level row type is more reasonable.
   
   But because the change is huge (much of the existing code that converts a type info to a data type assumed the type was nullable), @wuchong and I decided to change the method signature to `convertToSchema(RowType schema)` and add a note to the method doc that the passed-in `schema` must be the top-level record type.
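
   A hedged sketch of that API shape. The class names here are stand-ins for `org.apache.flink.table.types.logical.RowType` and `org.apache.avro.Schema`, and the method bodies are placeholders rather than the real conversion logic; only the signature shape is the point.

   ```java
   // Stand-in types; only the entry-point signature mirrors the proposal.
   public class ConverterApiSketch {
       static class RowType {
           final boolean nullable;
           RowType(boolean nullable) { this.nullable = nullable; }
       }

       /**
        * Top-level entry point: the caller passes the root row type of the
        * table, and its nullability flag is ignored because a table row is
        * always NOT NULL. No boolean "top" parameter leaks into the API.
        */
       static String convertToSchema(RowType rowType) {
           return convert(false, "record"); // force NOT NULL at the root
       }

       private static String convert(boolean nullable, String rowName) {
           String record = "{\"type\":\"record\",\"name\":\"" + rowName + "\"}";
           return nullable ? "[\"null\"," + record + "]" : record;
       }
   }
   ```

   With this shape, even a `RowType` flagged nullable produces a plain record schema at the root instead of a `["null", …]` union.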




----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 6ef138838f1a4a0b45a42eff39b75847c9b39767 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277) 
   * 11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r512374210



##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverterTest.java
##########
@@ -104,48 +104,115 @@ public void testRowTypeAvroSchemaConversion() {
 				DataTypes.FIELD("row3", DataTypes.ROW(DataTypes.FIELD("c", DataTypes.STRING())))))
 			.build().toRowDataType().getLogicalType();
 		Schema schema = AvroSchemaConverter.convertToSchema(rowType);
-		assertEquals("{\n" +
+		assertEquals("[ {\n" +

Review comment:
       Already added, see `AvroSchemaConverterTest.testConversionIntegralityNullable`.







[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408",
       "triggerID" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8440",
       "triggerID" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f004220668e20dcd9860026b69566868d473db33",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8455",
       "triggerID" : "f004220668e20dcd9860026b69566868d473db33",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4c5a06b1ca2833fe7f63c25503660ed7acf9b77d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8503",
       "triggerID" : "4c5a06b1ca2833fe7f63c25503660ed7acf9b77d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "13712c13da124c38a430d4cc50ad28aec4318764",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8556",
       "triggerID" : "13712c13da124c38a430d4cc50ad28aec4318764",
       "triggerType" : "PUSH"
     }, {
       "hash" : "170355bcb112d902e78ee4fc1644b3e6aee5eb29",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8584",
       "triggerID" : "170355bcb112d902e78ee4fc1644b3e6aee5eb29",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 13712c13da124c38a430d4cc50ad28aec4318764 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8556) 
   * 170355bcb112d902e78ee4fc1644b3e6aee5eb29 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8584) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] V1ncentzzZ commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
V1ncentzzZ commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r510774044



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -362,7 +369,11 @@ public static Schema convertToSchema(LogicalType logicalType, String rowName) {
 					.record(rowName)
 					.fields();
 				for (int i = 0; i < rowType.getFieldCount(); i++) {
-					String fieldName = rowName + "_" + fieldNames.get(i);
+					String fieldName = fieldNames.get(i);
+					if (rowName.equals(fieldName)) {
+						// Can not build schema when the record and field have the same name
+						fieldName = rowName + "_" + fieldName;
+					}

Review comment:
       I think there is still a bug. See: https://issues.apache.org/jira/browse/FLINK-19779?focusedCommentId=17219588&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-17219588
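   The prefixing rule in the diff above can be sketched as a small standalone helper (the class and method names here are hypothetical, not part of `AvroSchemaConverter`):
   
   ```java
   public class FieldNaming {
       // Sketch of the naming rule from the diff above: prefix the field name
       // with the record name only when the two collide, because the Avro
       // schema cannot be built when a field shares the record's own name.
       static String uniqueFieldName(String rowName, String fieldName) {
           if (rowName.equals(fieldName)) {
               return rowName + "_" + fieldName;
           }
           return fieldName;
       }
   
       public static void main(String[] args) {
           System.out.println(uniqueFieldName("record", "record")); // record_record
           System.out.println(uniqueFieldName("record", "f0"));     // f0
       }
   }
   ```
   
   As the linked JIRA comment suggests, a per-level check like this may still miss collisions between nested records, which is the remaining concern raised here.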







[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408",
       "triggerID" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369) 
   * a28669bca9c36dac74f89da177825d73bea0ece0 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r513135789



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether it is parsing the root record,
+	 *                    if it is, the logical type nullability would be ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       > `In SQL the outer row is always NOT NULL`
   
   That is not true: a row (null, null) still has nullability = true. And I don't think it makes sense to change the planner behavior in general in order to fix a specific use case.
   
   > // you can achieve the same with
   > // Schema schema = convertToSchema(type.notNull(), "record")
   
   I don't think we should let each invoker decide whether to make the data type not null, because in the current codebase we should always do that; making every caller decide is error-prone and hard to maintain.
   
   I have merged the `top` flag into a row type nullability switch, see `AvroSchemaConverter.convertToSchema(LogicalType logicalType)`.







[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r513162229



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,32 +300,53 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
+		// If it is parsing the root row type, switches from nullable true to false
+		// because a nullable row type is meaningless and would generate wrong schema.
+		if (logicalType.getTypeRoot() == LogicalTypeRoot.ROW

Review comment:
       Although the `AvroSchemaConverter` is a tool class, it is still used as a public API, so I'm inclined to keep the signature unchanged. Another reason is that the logical type alone is enough for the conversion.
   
   I have added more documentation to the methods.
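   For context on why a nullable root type generates a wrong schema: Avro conventionally encodes a nullable type as a union with `"null"`, so a nullable top-level row would serialize to a union schema instead of a plain record. A minimal sketch of that encoding (the helper below is illustrative, not Flink code):
   
   ```java
   public class NullableEncoding {
       // Avro's common encoding of nullability: a nullable type T becomes the
       // union ["null", T]. Applied to the root row type, this would yield a
       // union schema where a plain record schema is expected.
       static String toAvroType(String typeJson, boolean nullable) {
           return nullable ? "[\"null\", " + typeJson + "]" : typeJson;
       }
   
       public static void main(String[] args) {
           String record = "{\"type\":\"record\",\"name\":\"record\",\"fields\":[]}";
           System.out.println(toAvroType(record, true));  // a union, not a record
           System.out.println(toAvroType(record, false)); // the record itself
       }
   }
   ```
   
   Forcing the root row to NOT NULL inside the converter keeps the emitted schema a record regardless of what nullability the planner reported.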







[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 6ef138838f1a4a0b45a42eff39b75847c9b39767 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277) 
   * 11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dawidwys commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r513252724



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether it is parsing the root record,
+	 *                    if it is, the logical type nullability would be ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       > It is not true, a row (null, null) is nullable true. And i don't think it makes sense to change the planner behavior in general in order to fix a specific use case.
   
       Excuse me, but I wholeheartedly disagree with your statement. null =/= (null, null): (null, null) is still `NOT NULL`. A whole row in a Table cannot be null; only particular columns can be null. Therefore the top-level row of a Table is always `NOT NULL`.
   
   I am not suggesting changing planner behaviour for a particular use case. The planner should always produce `NOT NULL` type for a top level row of a Table. If it doesn't, it is a bug.
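   The distinction being argued here can be shown concretely (plain Java arrays stand in for row values; this is an illustration, not Flink's row type):
   
   ```java
   public class RowNullability {
       public static void main(String[] args) {
           // A row whose columns are all null is still a non-null row value ...
           Object[] rowOfNulls = new Object[] { null, null };
           // ... which is different from the row reference itself being null.
           Object[] nullRow = null;
   
           System.out.println(rowOfNulls != null); // true: (null, null) is NOT NULL
           System.out.println(nullRow == null);    // true
       }
   }
   ```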







[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408",
       "triggerID" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a28669bca9c36dac74f89da177825d73bea0ece0 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408",
       "triggerID" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a28669bca9c36dac74f89da177825d73bea0ece0 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408) 
   * ba752274c9926115f65f5ecbef55b71b0b71cfa2 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408",
       "triggerID" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8440",
       "triggerID" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f004220668e20dcd9860026b69566868d473db33",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8455",
       "triggerID" : "f004220668e20dcd9860026b69566868d473db33",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4c5a06b1ca2833fe7f63c25503660ed7acf9b77d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8503",
       "triggerID" : "4c5a06b1ca2833fe7f63c25503660ed7acf9b77d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "13712c13da124c38a430d4cc50ad28aec4318764",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8556",
       "triggerID" : "13712c13da124c38a430d4cc50ad28aec4318764",
       "triggerType" : "PUSH"
     }, {
       "hash" : "170355bcb112d902e78ee4fc1644b3e6aee5eb29",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "170355bcb112d902e78ee4fc1644b3e6aee5eb29",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 13712c13da124c38a430d4cc50ad28aec4318764 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8556) 
   * 170355bcb112d902e78ee4fc1644b3e6aee5eb29 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408",
       "triggerID" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8440",
       "triggerID" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f004220668e20dcd9860026b69566868d473db33",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8455",
       "triggerID" : "f004220668e20dcd9860026b69566868d473db33",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4c5a06b1ca2833fe7f63c25503660ed7acf9b77d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8503",
       "triggerID" : "4c5a06b1ca2833fe7f63c25503660ed7acf9b77d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "13712c13da124c38a430d4cc50ad28aec4318764",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8556",
       "triggerID" : "13712c13da124c38a430d4cc50ad28aec4318764",
       "triggerType" : "PUSH"
     }, {
       "hash" : "170355bcb112d902e78ee4fc1644b3e6aee5eb29",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8584",
       "triggerID" : "170355bcb112d902e78ee4fc1644b3e6aee5eb29",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e0a53361318b9ee5ceae9a1d6bc50f77a2d28c8a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8591",
       "triggerID" : "e0a53361318b9ee5ceae9a1d6bc50f77a2d28c8a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 13712c13da124c38a430d4cc50ad28aec4318764 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8556) 
   * 170355bcb112d902e78ee4fc1644b3e6aee5eb29 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8584) 
   * e0a53361318b9ee5ceae9a1d6bc50f77a2d28c8a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8591) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dawidwys commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r512673931



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether it is parsing the root record,
+	 *                    if it is, the logical type nullability would be ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       Sorry, but I don't understand your arguments.
   
    What do you mean by "inferring" the outer row nullability? There is no inference. In SQL the outer row is always `NOT NULL`; otherwise the row simply does not exist.
   
   In the current shape you have two arguments that contradict each other. Take this example:
   
   ```
    DataType type = DataTypes.ROW(
        DataTypes.FIELD("f0", DataTypes.ROW(
            DataTypes.FIELD("f1", DataTypes.BIGINT())
        ).nullable())
    ).nullable();
    
    Schema schema = convertToSchema(type, "record", true); // <- you're overriding the property of type
    // you can achieve the same with
    // Schema schema = convertToSchema(type.notNull(), "record");
   ```
   
    It's the Planner's task to produce a `notNull` type for a Table's type.
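    The equivalence argued above can be sketched with plain values. This is a hypothetical model, not the Flink or Avro API: logical types are `(kind, nullable, fields)` tuples and Avro schemas are JSON-style Python values, where a nullable type becomes the union `["null", schema]`.
    
    ```python
    # Minimal sketch (hypothetical names, not the Flink API).
    def convert(t, top=False):
        kind, nullable, fields = t
        if kind == "row":
            schema = {"type": "record",
                      "fields": [{"name": n, "type": convert(ft)} for n, ft in fields]}
        else:
            schema = kind
        # A nullable type maps to a union with "null" -- unless it is the
        # top-level row, whose nullability is ignored when top=True.
        if nullable and not top:
            return ["null", schema]
        return schema
    
    def not_null(t):
        kind, _, fields = t
        return (kind, False, fields)
    
    inner = ("bigint", True, None)
    row = ("row", True, [("f0", inner)])
    
    # Overriding the top-level nullability is the same as converting the
    # not-null variant of the same type:
    assert convert(row, top=True) == convert(not_null(row))
    ```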
   







[GitHub] [flink] wuchong commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
wuchong commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r513140215



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,32 +300,53 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
+		// If it is parsing the root row type, switches from nullable true to false
+		// because a nullable row type is meaningless and would generate wrong schema.
+		if (logicalType.getTypeRoot() == LogicalTypeRoot.ROW

Review comment:
    I think we can change the parameter to `RowType`; that makes sense because it is used as the top-level row type and should not be made nullable. Besides, it would be better to add comments in the Javadoc. Currently, this method has the same Javadoc as `convertToSchema(LogicalType logicalType, String rowName)`. 







[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * b92a32d37a8eaa446f24f44e467e588cdabcd9f5 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166) 
   * 6ef138838f1a4a0b45a42eff39b75847c9b39767 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dawidwys commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r511842204



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -417,4 +431,11 @@ public static LogicalType extractValueTypeToAvroMap(LogicalType type) {
 		}
 		return builder;
 	}
+
+	/** Returns schema with nullable true. */
+	private static Schema nullableSchema(Schema schema) {

Review comment:
       Could we stick to a single way of declaring `Schema` nullable? With this PR we have two methods for the same purpose:
   * `nullableSchema`
   * `getNullableBuilder`
   
   Either use the `nullableSchema` everywhere or use `getNullableBuilder(...).type(...)`.
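    A single helper of the kind suggested above can be sketched as follows. This is a hedged sketch, not the Avro Java `SchemaBuilder` API: schemas are modeled as JSON-style Python values, and Avro represents a nullable type as the union `["null", schema]`.
    
    ```python
    # Hypothetical helper: wrap any schema in a nullable union.
    def nullable_schema(schema):
        # Placing "null" first follows the common Avro convention for
        # nullable unions (it allows a null default value for the field).
        return ["null", schema]
    
    assert nullable_schema("string") == ["null", "string"]
    assert nullable_schema({"type": "record", "name": "r", "fields": []}) == \
        ["null", {"type": "record", "name": "r", "fields": []}]
    ```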

##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverterTest.java
##########
@@ -104,48 +104,115 @@ public void testRowTypeAvroSchemaConversion() {
 				DataTypes.FIELD("row3", DataTypes.ROW(DataTypes.FIELD("c", DataTypes.STRING())))))
 			.build().toRowDataType().getLogicalType();
 		Schema schema = AvroSchemaConverter.convertToSchema(rowType);
-		assertEquals("{\n" +
+		assertEquals("[ {\n" +

Review comment:
    I think it would be nice to add a test that we can convert back and forth between `DataType` and `Schema` with respect to the field names. 

##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -362,7 +369,11 @@ public static Schema convertToSchema(LogicalType logicalType, String rowName) {
 					.record(rowName)
 					.fields();
 				for (int i = 0; i < rowType.getFieldCount(); i++) {
-					String fieldName = rowName + "_" + fieldNames.get(i);
+					String fieldName = fieldNames.get(i);
+					if (rowName.equals(fieldName)) {
+						// Can not build schema when the record and field have the same name
+						fieldName = rowName + "_" + fieldName;
+					}

Review comment:
       I think the correct solution will be:
   
   ```
   				RowType rowType = (RowType) logicalType;
   				List<String> fieldNames = rowType.getFieldNames();
   				// we have to make sure the record name is different in a Schema
   				SchemaBuilder.FieldAssembler<Schema> builder =
   						getNullableBuilder(logicalType)
   								.record(rowName)
   								.fields();
   				for (int i = 0; i < rowType.getFieldCount(); i++) {
   					String fieldName = fieldNames.get(i);
   					builder = builder
   						.name(fieldName)
   						.type(convertToSchema(rowType.getTypeAt(i), rowName + "_" + fieldName))
   						.noDefault();
   				}
   				return builder.endRecord();
   ```

##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -362,7 +369,11 @@ public static Schema convertToSchema(LogicalType logicalType, String rowName) {
 					.record(rowName)
 					.fields();
 				for (int i = 0; i < rowType.getFieldCount(); i++) {
-					String fieldName = rowName + "_" + fieldNames.get(i);
+					String fieldName = fieldNames.get(i);
+					if (rowName.equals(fieldName)) {
+						// Can not build schema when the record and field have the same name
+						fieldName = rowName + "_" + fieldName;
+					}

Review comment:
    That is not correct. The builder does support the same names for fields at different nesting levels.
   
    Avro in general does not support records with the same name but different schemas, and rightly so. Therefore a schema like:
   
   ```
   {
       "type": "record", 
       "name": "top", 
       "fields": [ 
   		  {
                "name": "top", 
                "type": { 
                    "type": "record", 
                    "name": "nested", 
                    "fields": [ 
                        {"type": "string", "name": "top"} 
                    ]
                }
             }
       ] 
   }
   ```
   is valid and supported. However if we change the name of the `nested` record to `top` it will be invalid:
   ```
   {
       "type": "record", 
       "name": "top", 
       "fields": [ 
   		  {
                "name": "top", 
                "type": { 
                    "type": "record", 
                    "name": "top", 
                    "fields": [ 
                        {"type": "string", "name": "top"} 
                    ]
                }
             }
       ] 
   }
   ```
   
    I think the core problem lies in how the `rowName` is generated. I think we should never adjust the `fieldName`, but we should append the `fieldName` to the `rowName`.
   
   BTW another shortcoming that I see is that we are losing the record name when converting from `Schema` to `DataType`. I think it is not a real issue though.
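    The naming scheme proposed above can be sketched as follows. This is a hypothetical model, not the actual converter: nested record names are derived by appending the field name to the parent record name, so no two records in one schema can collide even when the same field name repeats at every level, as in the `top`/`top`/`top` example.
    
    ```python
    # Hypothetical sketch of the rowName generation strategy.
    def record_name(parent, field):
        return parent + "_" + field
    
    names = set()
    
    def collect(row_name, fields):
        # Every generated record name must be unique within the schema.
        assert row_name not in names
        names.add(row_name)
        for field, sub_fields in fields:
            if sub_fields is not None:  # a nested row needs its own record name
                collect(record_name(row_name, field), sub_fields)
    
    # A field named "top" nested under a record named "top" at every level:
    collect("top", [("top", [("top", None)])])
    assert names == {"top", "top_top"}
    ```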







[GitHub] [flink] wuchong commented on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
wuchong commented on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715241507


   Let's hold this PR a bit and continue discussion in FLINK-19779.





[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408",
       "triggerID" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8440",
       "triggerID" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f004220668e20dcd9860026b69566868d473db33",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8455",
       "triggerID" : "f004220668e20dcd9860026b69566868d473db33",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4c5a06b1ca2833fe7f63c25503660ed7acf9b77d",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8503",
       "triggerID" : "4c5a06b1ca2833fe7f63c25503660ed7acf9b77d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "13712c13da124c38a430d4cc50ad28aec4318764",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "13712c13da124c38a430d4cc50ad28aec4318764",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 4c5a06b1ca2833fe7f63c25503660ed7acf9b77d Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8503) 
   * 13712c13da124c38a430d4cc50ad28aec4318764 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r512616616



##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/AvroRowDataDeSerializationSchemaTest.java
##########
@@ -181,10 +181,11 @@ public void testSpecificType() throws Exception {
 		encoder.flush();
 		byte[] input = byteArrayOutputStream.toByteArray();
 
+		// SE/DE SpecificRecord using the GenericRecord way only supports non-nullable data type.

Review comment:
        Thanks, I will remove it.







[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369) 
   * a28669bca9c36dac74f89da177825d73bea0ece0 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>








[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408",
       "triggerID" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8440",
       "triggerID" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a28669bca9c36dac74f89da177825d73bea0ece0 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408) 
   * ba752274c9926115f65f5ecbef55b71b0b71cfa2 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8440) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408",
       "triggerID" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8440",
       "triggerID" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f004220668e20dcd9860026b69566868d473db33",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8455",
       "triggerID" : "f004220668e20dcd9860026b69566868d473db33",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * f004220668e20dcd9860026b69566868d473db33 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8455) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * e0a53361318b9ee5ceae9a1d6bc50f77a2d28c8a Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8591) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * b92a32d37a8eaa446f24f44e467e588cdabcd9f5 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166) 
   * 6ef138838f1a4a0b45a42eff39b75847c9b39767 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dawidwys commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r512596226



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether it is parsing the root record,
+	 *                    if it is, the logical type nullability would be ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       Is it necessary? Shouldn't the `top` be handled in the planner and just passed with a correct setting in the `logicalType`? I think it is a bit too deep to fix it here.
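   The semantics under discussion can be sketched as follows. This is a hypothetical, simplified model (plain Python values standing in for Avro JSON schemas and Flink logical types; names are illustrative, not the actual Flink API): with the `top` flag, the root record's nullability is ignored, while nested nullable types become an Avro union with `"null"`.

   ```python
   # Toy model: logical types are dicts, Avro schemas are dicts/lists
   # (Avro's JSON representation of a union is a list of branches).

   def convert_to_schema(logical_type, row_name="record", top=True):
       """Convert a toy logical type to a toy Avro JSON schema."""
       if logical_type["type"] == "row":
           fields = [
               {"name": name, "type": convert_to_schema(child, name, top=False)}
               for name, child in logical_type["fields"]
           ]
           schema = {"type": "record", "name": row_name, "fields": fields}
           # Only non-root nullable rows are wrapped in a union with "null".
           if logical_type.get("nullable") and not top:
               return ["null", schema]
           return schema
       avro_type = logical_type["type"]  # e.g. "int", "string"
       if logical_type.get("nullable") and not top:
           return ["null", avro_type]
       return avro_type

   root = {
       "type": "row",
       "nullable": True,  # ignored because this is the root record
       "fields": [("f0", {"type": "int", "nullable": True})],
   }
   schema = convert_to_schema(root)
   print(schema["type"])               # "record" -- root nullability ignored
   print(schema["fields"][0]["type"])  # ["null", "int"]
   ```

   Whether this flag belongs in the converter or whether the planner should pass an already-non-nullable root type is exactly the design question raised above.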

##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/AvroRowDataDeSerializationSchemaTest.java
##########
@@ -181,10 +181,11 @@ public void testSpecificType() throws Exception {
 		encoder.flush();
 		byte[] input = byteArrayOutputStream.toByteArray();
 
+		// SE/DE SpecificRecord using the GenericRecord way only supports non-nullable data type.

Review comment:
       That comment is misleading. There is no limitation in the actual handling of nulls with SpecificRecord/GenericRecord.
   It's just that the `LogicalTimeRecord` has those fields declared as `notNull`. The previous `dataType` was just wrong and did not describe the `LogicalTimeRecord` correctly.







[GitHub] [flink] danny0405 commented on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-717026626


   Thanks @dawidwys for the review, I have addressed your comments.





[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r513264987



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether it is parsing the root record,
+	 *                    if it is, the logical type nullability would be ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       >Excuse me, but I wholeheartedly disagree with your statement. null =/= (null, null). (null, null) is still NOT NULL. A whole row in a Table can not be null. Only particular columns can be null. Therefore the top level row of a Table is always NOT NULL.
   
   You can test it in PostgreSQL with the following SQL:
   ```sql
   create type my_type as (a int, b varchar(20));
   
   create table t1(
     f0 my_type,
     f1 varchar(20)
   );
   
   insert into t1 values((null, null), 'def');
   
   select f0 is null from t1; -- it returns true
   ```







[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * a28669bca9c36dac74f89da177825d73bea0ece0 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408) 
   * ba752274c9926115f65f5ecbef55b71b0b71cfa2 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8440) 
   * f004220668e20dcd9860026b69566868d473db33 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r512367627



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -417,4 +431,11 @@ public static LogicalType extractValueTypeToAvroMap(LogicalType type) {
 		}
 		return builder;
 	}
+
+	/** Returns schema with nullable true. */
+	private static Schema nullableSchema(Schema schema) {

Review comment:
       They are for different purposes, but I think we can use `nullableSchema` throughout.
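   The helper's intent can be sketched on Avro's JSON schema representation (a minimal sketch under the assumption that schemas are plain Python values, with a union modeled as a list of branches; this is not the actual Java helper):

   ```python
   def nullable_schema(schema):
       """Make a schema nullable by wrapping it in a union with "null",
       unless it is already such a union."""
       if isinstance(schema, list) and "null" in schema:
           return schema  # already nullable, leave unchanged
       return ["null", schema]

   print(nullable_schema("long"))              # ["null", "long"]
   print(nullable_schema(["null", "string"]))  # unchanged
   ```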







[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r511757745



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -362,7 +369,11 @@ public static Schema convertToSchema(LogicalType logicalType, String rowName) {
 					.record(rowName)
 					.fields();
 				for (int i = 0; i < rowType.getFieldCount(); i++) {
-					String fieldName = rowName + "_" + fieldNames.get(i);
+					String fieldName = fieldNames.get(i);
+					if (rowName.equals(fieldName)) {
+						// Can not build schema when the record and field have the same name
+						fieldName = rowName + "_" + fieldName;
+					}

Review comment:
       Yes, the Avro schema builder does not allow a field to share its enclosing record's name, even when the two live in different scopes (different layers).
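   The naming rule from the diff above can be sketched like this (illustrative only; the real logic lives in `AvroSchemaConverter.convertToSchema`): only a field that collides with the record name receives the `<rowName>_` prefix, while all other fields keep their plain names.

   ```python
   def field_names_for(row_name, field_names):
       """Prefix only the field whose name collides with the record name."""
       return [
           row_name + "_" + name if name == row_name else name
           for name in field_names
       ]

   print(field_names_for("order", ["order", "amount"]))
   # ["order_order", "amount"] -- only the colliding field is prefixed
   ```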







[GitHub] [flink] dawidwys commented on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-718568993


   I think there is still a test failing: `org.apache.flink.table.client.gateway.local.LocalExecutorITCase#testStreamQueryExecutionSink`





[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * 4c5a06b1ca2833fe7f63c25503660ed7acf9b77d Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8503) 
   * 13712c13da124c38a430d4cc50ad28aec4318764 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8556) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * 4c5a06b1ca2833fe7f63c25503660ed7acf9b77d Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8503) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * f004220668e20dcd9860026b69566868d473db33 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8455) 
   * 4c5a06b1ca2833fe7f63c25503660ed7acf9b77d Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8503) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r513927391



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether it is parsing the root record,
+	 *                    if it is, the logical type nullability would be ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       @dawidwys I have changed the row type schema to always be non-nullable; please take another look when you have time, thanks so much.







[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   ## CI report:
   
   * f004220668e20dcd9860026b69566868d473db33 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8455) 
   * 4c5a06b1ca2833fe7f63c25503660ed7acf9b77d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dawidwys merged pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
dawidwys merged pull request #13763:
URL: https://github.com/apache/flink/pull/13763


   





[GitHub] [flink] dawidwys commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r513253682



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether it is parsing the root record,
+	 *                    if it is, the logical type nullability would be ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       > I don't think we should let each invoker decide whether to make the data type not null, because in the current codebase we should always do that; making the decision everywhere is error-prone and hard to maintain.
   
   I agree that making the same decision over and over again at multiple locations is error-prone and hard to maintain, and that's what I want to avoid.







[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r512640835



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether it is parsing the root record,
+	 *                    if it is, the logical type nullability would be ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       Currently, only the Avro format cares about the outer row's nullability; we can switch to changing the planner if we find more use cases.
   
   Technically speaking, it is impossible to infer the outer row's nullability from the DDL alone.
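
   The convention at stake can be illustrated without Flink's classes. The sketch below is purely hypothetical (the helper name and structure are not from `AvroSchemaConverter`); it only shows that Avro encodes a nullable type as a union with `"null"`, so the top-level decision determines whether the whole schema becomes a union:

   ```python
   import json

   def to_avro_type(type_json, nullable, top=False):
       # Hypothetical sketch, not the real AvroSchemaConverter:
       # a nullable type becomes a union with "null", except at the
       # top level, where the nullability flag is ignored.
       if nullable and not top:
           return ["null", type_json]
       return type_json

   record = {"type": "record", "name": "record", "fields": []}

   # Top-level row: nullability ignored, plain record schema.
   print(json.dumps(to_avro_type(record, nullable=True, top=True)))
   # Nested nullable row: wrapped in a union with "null".
   print(json.dumps(to_avro_type(record, nullable=True)))
   ```

   Under this convention a reader of the schema alone cannot tell whether the top-level row was declared nullable, which is why the DDL cannot recover it.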







[GitHub] [flink] flinkbot commented on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715120169


   Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress of the review.
   
   
   ## Automated Checks
   Last check on commit b92a32d37a8eaa446f24f44e467e588cdabcd9f5 (Fri Oct 23 07:59:21 UTC 2020)
   
   **Warnings:**
    * No documentation files were touched! Remember to keep the Flink docs up to date!
    * **This pull request references an unassigned [Jira ticket](https://issues.apache.org/jira/browse/FLINK-19779).** According to the [code contribution guide](https://flink.apache.org/contributing/contribute-code.html), tickets need to be assigned before starting with the implementation work.
   
   
   <sub>Mention the bot in a comment to re-run the automated checks.</sub>
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
   * ❓ 2. There is [consensus] that the contribution should go into to Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process.<details>
     The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`)
    - `@flinkbot approve all` to approve all aspects
    - `@flinkbot approve-until architecture` to approve everything until `architecture`
    - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention
    - `@flinkbot disapprove architecture` to remove an approval you gave earlier
   </details>





[GitHub] [flink] danny0405 commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
danny0405 commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r513370899



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether it is parsing the root record,
+	 *                    if it is, the logical type nullability would be ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       I had an offline discussion with @wuchong and @dawidwys, and after some research we found that a non-nullable row type is more reasonable.
   
   But because the change is huge (much existing code that converts a type info to a data type assumes nullable is true), @wuchong and I decided to change the method signature to `convertToSchema(RowType schema)` and add a note to the method doc that the passed-in `schema` must be the top-level record type.
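
   As a hedged sketch of that direction (the function name, field spec, and string leaf type are illustrative stand-ins, not Flink's actual implementation): the entry point takes the top-level row and always emits a plain record, while nullable fields inside it still become unions:

   ```python
   def convert_row_to_schema(fields_spec, record_name="record"):
       # Illustrative only: fields_spec is a list of (name, nullable)
       # pairs, and every leaf is a string type, just to show the shape.
       fields = []
       for name, nullable in fields_spec:
           avro_type = ["null", "string"] if nullable else "string"
           fields.append({"name": name, "type": avro_type})
       # The top-level row's nullability is ignored: never a union here.
       return {"type": "record", "name": record_name, "fields": fields}

   schema = convert_row_to_schema([("a", True), ("b", False)])
   ```

   Restricting the public signature to the top-level row type makes the "ignore outer nullability" decision once, inside the converter, instead of at every call site.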







[GitHub] [flink] dawidwys commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r513269624



##########
File path: flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverter.java
##########
@@ -297,17 +299,22 @@ private static DataType convertToDataType(Schema schema) {
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
 	public static Schema convertToSchema(LogicalType logicalType) {
-		return convertToSchema(logicalType, "record");
+		return convertToSchema(logicalType, "record", true);
 	}
 
 	/**
 	 * Converts Flink SQL {@link LogicalType} (can be nested) into an Avro schema.
 	 *
 	 * @param logicalType logical type
 	 * @param rowName     the record name
+	 * @param top         whether it is parsing the root record,
+	 *                    if it is, the logical type nullability would be ignored
 	 * @return Avro's {@link Schema} matching this logical type.
 	 */
-	public static Schema convertToSchema(LogicalType logicalType, String rowName) {
+	public static Schema convertToSchema(
+			LogicalType logicalType,
+			String rowName,
+			boolean top) {

Review comment:
       But `my_type` is not a top-level row in your example; it is the type of a column. It has nothing to do with the case we're discussing.







[GitHub] [flink] flinkbot edited a comment on pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13763:
URL: https://github.com/apache/flink/pull/13763#issuecomment-715195599


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8166",
       "triggerID" : "b92a32d37a8eaa446f24f44e467e588cdabcd9f5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8277",
       "triggerID" : "6ef138838f1a4a0b45a42eff39b75847c9b39767",
       "triggerType" : "PUSH"
     }, {
       "hash" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8369",
       "triggerID" : "11a1af7e69935ec9bbbcc4aeea5dcb5a8b0c68ce",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8408",
       "triggerID" : "a28669bca9c36dac74f89da177825d73bea0ece0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8440",
       "triggerID" : "ba752274c9926115f65f5ecbef55b71b0b71cfa2",
       "triggerType" : "PUSH"
     }, {
       "hash" : "f004220668e20dcd9860026b69566868d473db33",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8455",
       "triggerID" : "f004220668e20dcd9860026b69566868d473db33",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4c5a06b1ca2833fe7f63c25503660ed7acf9b77d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8503",
       "triggerID" : "4c5a06b1ca2833fe7f63c25503660ed7acf9b77d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "13712c13da124c38a430d4cc50ad28aec4318764",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8556",
       "triggerID" : "13712c13da124c38a430d4cc50ad28aec4318764",
       "triggerType" : "PUSH"
     }, {
       "hash" : "170355bcb112d902e78ee4fc1644b3e6aee5eb29",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8584",
       "triggerID" : "170355bcb112d902e78ee4fc1644b3e6aee5eb29",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e0a53361318b9ee5ceae9a1d6bc50f77a2d28c8a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "e0a53361318b9ee5ceae9a1d6bc50f77a2d28c8a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 13712c13da124c38a430d4cc50ad28aec4318764 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8556) 
   * 170355bcb112d902e78ee4fc1644b3e6aee5eb29 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=8584) 
   * e0a53361318b9ee5ceae9a1d6bc50f77a2d28c8a UNKNOWN
   





[GitHub] [flink] dawidwys commented on a change in pull request #13763: [FLINK-19779][avro] Remove the record_ field name prefix for Confluen…

Posted by GitBox <gi...@apache.org>.
dawidwys commented on a change in pull request #13763:
URL: https://github.com/apache/flink/pull/13763#discussion_r512586324



##########
File path: flink-formats/flink-avro/src/test/java/org/apache/flink/formats/avro/typeutils/AvroSchemaConverterTest.java
##########
@@ -104,48 +104,115 @@ public void testRowTypeAvroSchemaConversion() {
 				DataTypes.FIELD("row3", DataTypes.ROW(DataTypes.FIELD("c", DataTypes.STRING())))))
 			.build().toRowDataType().getLogicalType();
 		Schema schema = AvroSchemaConverter.convertToSchema(rowType);
-		assertEquals("{\n" +
+		assertEquals("[ {\n" +

Review comment:
       True, though you test only `DataType` -> `Schema` and back. We could also add a test for `Schema` -> `DataType` (starting from a Schema), which is important in the case of a schema registry.
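
   The suggested roundtrip could look roughly like this toy sketch; the two converters here are stand-ins, not Flink's `AvroSchemaConverter`, and only model the nullable/union mapping:

   ```python
   def schema_to_datatype(avro_type):
       # Toy inverse mapping: a union containing "null" becomes a
       # (type, nullable=True) pair; anything else is non-nullable.
       if isinstance(avro_type, list) and "null" in avro_type:
           inner = next(t for t in avro_type if t != "null")
           return (inner, True)
       return (avro_type, False)

   def datatype_to_schema(datatype):
       avro_type, nullable = datatype
       return ["null", avro_type] if nullable else avro_type

   # Start from the Schema side, as the comment suggests, and check
   # that converting to a data type and back is lossless.
   original = ["null", "string"]
   assert datatype_to_schema(schema_to_datatype(original)) == original
   ```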



