Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2019/07/01 18:50:38 UTC

[GitHub] [flink] bowenli86 commented on a change in pull request #8935: [Flink-12973]Support reading Hive complex types like array and map

bowenli86 commented on a change in pull request #8935: [Flink-12973]Support reading Hive complex types like array and map
URL: https://github.com/apache/flink/pull/8935#discussion_r299172620
 
 

 ##########
 File path: flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/batch/connectors/hive/HiveInputFormatTest.java
 ##########
 @@ -122,4 +124,55 @@ public void testReadFromHiveInputFormat() throws Exception {
 		Assert.assertEquals("3,3,a,3000,3.33", rows.get(2).toString());
 		Assert.assertEquals("4,4,a,4000,4.44", rows.get(3).toString());
 	}
+
+	@Test
+	public void testReadComplexDataTypeFromHiveInputFormat() throws Exception {
+		final String dbName = "default";
+		final String tblName = "complex_test";
+
+		TableSchema.Builder builder = new TableSchema.Builder();
+		builder.fields(new String[]{"a", "m", "s"}, new DataType[]{
+				DataTypes.ARRAY(DataTypes.INT()),
+				DataTypes.MAP(DataTypes.INT(), DataTypes.STRING()),
+				DataTypes.ROW(DataTypes.FIELD("f1", DataTypes.INT()), DataTypes.FIELD("f2", DataTypes.STRING()))});
+
+		// For now we create the Hive table via the metastore client instead of HiveCatalog,
+		// because HiveCatalog does not yet support setting a serDe.
 
 Review comment:
   can you open a ticket for this work?
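
  For reference, the workaround described in the code comment above (creating the table
  directly through the Hive metastore client so that a serDe can be set explicitly) could
  look roughly like the sketch below. This is an illustrative sketch only, not the code from
  the PR; the serDe, input/output format classes, the field.delim property, and the class
  name are assumptions chosen for a plain-text table.

    // Illustrative sketch only -- not the PR's code. Creates the "complex_test" table
    // through the Hive metastore client with an explicit serDe, which is the workaround
    // the in-code comment above refers to. SerDe and format classes are assumptions.
    import org.apache.hadoop.hive.conf.HiveConf;
    import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
    import org.apache.hadoop.hive.metastore.TableType;
    import org.apache.hadoop.hive.metastore.api.FieldSchema;
    import org.apache.hadoop.hive.metastore.api.SerDeInfo;
    import org.apache.hadoop.hive.metastore.api.StorageDescriptor;
    import org.apache.hadoop.hive.metastore.api.Table;

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.HashMap;

    public class CreateComplexTestTable {
        public static void main(String[] args) throws Exception {
            HiveMetaStoreClient client = new HiveMetaStoreClient(new HiveConf());

            // Columns mirroring the Flink schema above, expressed as Hive type strings.
            StorageDescriptor sd = new StorageDescriptor();
            sd.setCols(Arrays.asList(
                    new FieldSchema("a", "array<int>", null),
                    new FieldSchema("m", "map<int,string>", null),
                    new FieldSchema("s", "struct<f1:int,f2:string>", null)));

            // The explicit serDe -- the part HiveCatalog cannot express yet.
            SerDeInfo serDeInfo = new SerDeInfo();
            serDeInfo.setSerializationLib("org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe");
            serDeInfo.setParameters(new HashMap<>());
            serDeInfo.getParameters().put("field.delim", ",");
            sd.setSerdeInfo(serDeInfo);
            sd.setInputFormat("org.apache.hadoop.mapred.TextInputFormat");
            sd.setOutputFormat("org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat");

            Table table = new Table();
            table.setDbName("default");
            table.setTableName("complex_test");
            table.setTableType(TableType.MANAGED_TABLE.toString());
            table.setParameters(new HashMap<>());
            table.setPartitionKeys(new ArrayList<>());
            table.setSd(sd);

            client.createTable(table);
            client.close();
        }
    }

  Once HiveCatalog supports setting a serDe (the work a ticket would track), the same table
  could presumably be created through the catalog API instead.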

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services