Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2020/04/30 01:50:02 UTC

[GitHub] [incubator-hudi] hddong opened a new pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

hddong opened a new pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574


   ## *Tips*
   - *Thank you very much for contributing to Apache Hudi.*
   - *Please review https://hudi.apache.org/contributing.html before opening a pull request.*
   
   ## What is the purpose of the pull request
   
   *Add unit test for HDFSParquetImportCommand*
   
   ## Brief change log
   
     - *Modify AnnotationLocation checkstyle rule in checkstyle.xml*
   
   ## Verify this pull request
   
   This pull request is a trivial rework / code cleanup without any test coverage.
   
   ## Committer checklist
   
    - [ ] Has a corresponding JIRA in PR title & commit
    
    - [ ] Commit message is descriptive of the change
    
    - [ ] CI is green
   
    - [ ] Necessary doc changes done or have another open PR
          
    - [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-hudi] codecov-io edited a comment on pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#issuecomment-621599057


   # [Codecov](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=h1) Report
   > Merging [#1574](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=desc) into [master](https://codecov.io/gh/apache/incubator-hudi/commit/d54b4b8a525868ea6d15e2e2cc6ffccc62d5c43c&el=desc) will **increase** coverage by `0.02%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/incubator-hudi/pull/1574/graphs/tree.svg?width=650&height=150&src=pr&token=VTTXabwbs2)](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=tree)
   
   ```diff
   @@             Coverage Diff              @@
   ##             master    #1574      +/-   ##
   ============================================
   + Coverage     71.77%   71.79%   +0.02%     
     Complexity     1087     1087              
   ============================================
     Files           385      385              
     Lines         16575    16575              
     Branches       1668     1668              
   ============================================
   + Hits          11896    11900       +4     
   + Misses         3951     3946       -5     
   - Partials        728      729       +1     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=tree) | Coverage Δ | Complexity Δ | |
   |---|---|---|---|
   | [...src/main/java/org/apache/hudi/metrics/Metrics.java](https://codecov.io/gh/apache/incubator-hudi/pull/1574/diff?src=pr&el=tree#diff-aHVkaS1jbGllbnQvc3JjL21haW4vamF2YS9vcmcvYXBhY2hlL2h1ZGkvbWV0cmljcy9NZXRyaWNzLmphdmE=) | `67.56% <0.00%> (+10.81%)` | `0.00% <0.00%> (ø%)` | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=footer). Last update [d54b4b8...25480f3](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [incubator-hudi] yanghua edited a comment on pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
yanghua edited a comment on pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#issuecomment-625312245


   @hddong  LGTM and will merge it after you fix the conflicting file.





[GitHub] [incubator-hudi] pratyakshsharma commented on a change in pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
pratyakshsharma commented on a change in pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#discussion_r421735803



##########
File path: hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkMain.java
##########
@@ -82,17 +82,17 @@ public static void main(String[] args) throws Exception {
         break;
       case IMPORT:
       case UPSERT:
-        assert (args.length >= 12);
+        assert (args.length >= 13);
         String propsFilePath = null;
-        if (!StringUtils.isNullOrEmpty(args[11])) {
-          propsFilePath = args[11];
+        if (!StringUtils.isNullOrEmpty(args[12])) {
+          propsFilePath = args[12];
         }
         List<String> configs = new ArrayList<>();
-        if (args.length > 12) {
-          configs.addAll(Arrays.asList(args).subList(12, args.length));
+        if (args.length > 13) {
+          configs.addAll(Arrays.asList(args).subList(13, args.length));
         }
-        returnCode = dataLoad(jsc, command, args[1], args[2], args[3], args[4], args[5], args[6],
-            Integer.parseInt(args[7]), args[8], args[9], Integer.parseInt(args[10]), propsFilePath, configs);
+        returnCode = dataLoad(jsc, command, args[3], args[4], args[5], args[6], args[7], args[8],

Review comment:
    @yanghua It's been there for some time. Development was done; I was stuck on one of the test cases when I last worked on it. Will take a look at it soon :)







[GitHub] [incubator-hudi] yanghua commented on pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
yanghua commented on pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#issuecomment-628527261


   @hddong There is another conflicting file. Please fix it.





[GitHub] [incubator-hudi] codecov-io commented on pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
codecov-io commented on pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#issuecomment-621599057


   # [Codecov](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=h1) Report
   > Merging [#1574](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=desc) into [master](https://codecov.io/gh/apache/incubator-hudi/commit/9059bce977cee98e2d65769622c46a1941c563dd&el=desc) will **increase** coverage by `0.04%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/incubator-hudi/pull/1574/graphs/tree.svg?width=650&height=150&src=pr&token=VTTXabwbs2)](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=tree)
   
   ```diff
   @@             Coverage Diff              @@
   ##             master    #1574      +/-   ##
   ============================================
   + Coverage     71.81%   71.85%   +0.04%     
     Complexity      294      294              
   ============================================
     Files           385      385              
     Lines         16540    16540              
     Branches       1661     1661              
   ============================================
   + Hits          11878    11885       +7     
   + Misses         3932     3925       -7     
     Partials        730      730              
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=tree) | Coverage Δ | Complexity Δ | |
   |---|---|---|---|
   | [...n/java/org/apache/hudi/common/model/HoodieKey.java](https://codecov.io/gh/apache/incubator-hudi/pull/1574/diff?src=pr&el=tree#diff-aHVkaS1jb21tb24vc3JjL21haW4vamF2YS9vcmcvYXBhY2hlL2h1ZGkvY29tbW9uL21vZGVsL0hvb2RpZUtleS5qYXZh) | `94.44% <0.00%> (+5.55%)` | `0.00% <0.00%> (ø%)` | |
   | [...src/main/java/org/apache/hudi/metrics/Metrics.java](https://codecov.io/gh/apache/incubator-hudi/pull/1574/diff?src=pr&el=tree#diff-aHVkaS1jbGllbnQvc3JjL21haW4vamF2YS9vcmcvYXBhY2hlL2h1ZGkvbWV0cmljcy9NZXRyaWNzLmphdmE=) | `67.56% <0.00%> (+10.81%)` | `0.00% <0.00%> (ø%)` | |
   | [...g/apache/hudi/metrics/InMemoryMetricsReporter.java](https://codecov.io/gh/apache/incubator-hudi/pull/1574/diff?src=pr&el=tree#diff-aHVkaS1jbGllbnQvc3JjL21haW4vamF2YS9vcmcvYXBhY2hlL2h1ZGkvbWV0cmljcy9Jbk1lbW9yeU1ldHJpY3NSZXBvcnRlci5qYXZh) | `80.00% <0.00%> (+40.00%)` | `0.00% <0.00%> (ø%)` | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=footer). Last update [9059bce...f22852b](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [incubator-hudi] hddong commented on pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
hddong commented on pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#issuecomment-626263014


   @yanghua : It's OK now.





[GitHub] [incubator-hudi] yanghua merged pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
yanghua merged pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574


   





[GitHub] [incubator-hudi] codecov-io edited a comment on pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#issuecomment-621599057


   # [Codecov](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=h1) Report
   > Merging [#1574](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=desc) into [master](https://codecov.io/gh/apache/incubator-hudi/commit/83796b3189570182c68a9c41e57b356124c301ca&el=desc) will **decrease** coverage by `0.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/incubator-hudi/pull/1574/graphs/tree.svg?width=650&height=150&src=pr&token=VTTXabwbs2)](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=tree)
   
   ```diff
   @@             Coverage Diff              @@
   ##             master    #1574      +/-   ##
   ============================================
   - Coverage     71.80%   71.78%   -0.02%     
   - Complexity     1087     1088       +1     
   ============================================
     Files           385      385              
     Lines         16591    16591              
     Branches       1669     1669              
   ============================================
   - Hits          11913    11910       -3     
   - Misses         3949     3952       +3     
     Partials        729      729              
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=tree) | Coverage Δ | Complexity Δ | |
   |---|---|---|---|
   | [...ache/hudi/common/fs/inline/InMemoryFileSystem.java](https://codecov.io/gh/apache/incubator-hudi/pull/1574/diff?src=pr&el=tree#diff-aHVkaS1jb21tb24vc3JjL21haW4vamF2YS9vcmcvYXBhY2hlL2h1ZGkvY29tbW9uL2ZzL2lubGluZS9Jbk1lbW9yeUZpbGVTeXN0ZW0uamF2YQ==) | `79.31% <0.00%> (-10.35%)` | `0.00% <0.00%> (ø%)` | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=footer). Last update [83796b3...d825a37](https://codecov.io/gh/apache/incubator-hudi/pull/1574?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   








[GitHub] [incubator-hudi] garyli1019 commented on pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
garyli1019 commented on pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#issuecomment-632830706


   Thanks @hddong. I am using Spark installed by brew, and `mkdir /tmp/spark-events/` fixed the issue. 
   IMO, if Docker is not possible, then we may consider using a bootstrap script to set up the environment for testing, something like https://github.com/apache/impala/blob/master/bin/impala-config.sh. This script would be executed before every build.
   The downside is that it would affect the user's local environment variables. 





[GitHub] [incubator-hudi] yanghua commented on a change in pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
yanghua commented on a change in pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#discussion_r421393879



##########
File path: hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkMain.java
##########
@@ -82,17 +82,17 @@ public static void main(String[] args) throws Exception {
         break;
       case IMPORT:
       case UPSERT:
-        assert (args.length >= 12);
+        assert (args.length >= 13);
         String propsFilePath = null;
-        if (!StringUtils.isNullOrEmpty(args[11])) {
-          propsFilePath = args[11];
+        if (!StringUtils.isNullOrEmpty(args[12])) {
+          propsFilePath = args[12];
         }
         List<String> configs = new ArrayList<>();
-        if (args.length > 12) {
-          configs.addAll(Arrays.asList(args).subList(12, args.length));
+        if (args.length > 13) {
+          configs.addAll(Arrays.asList(args).subList(13, args.length));
         }
-        returnCode = dataLoad(jsc, command, args[1], args[2], args[3], args[4], args[5], args[6],
-            Integer.parseInt(args[7]), args[8], args[9], Integer.parseInt(args[10]), propsFilePath, configs);
+        returnCode = dataLoad(jsc, command, args[3], args[4], args[5], args[6], args[7], args[8],

Review comment:
       Thanks for reminding me. It seems that PR is inactive?







[GitHub] [incubator-hudi] hddong commented on pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
hddong commented on pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#issuecomment-621601455


   @yanghua: It's green now; please review it when you are free.





[GitHub] [incubator-hudi] hddong commented on pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
hddong commented on pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#issuecomment-632434767


   @garyli1019: I tried Docker before; it usually uses `execStartCmd` to execute commands directly.
   But for hudi-cli, we need to execute commands in interactive mode. I will try it again later, but it needs some time. If you have any solution or suggestion, please let me know.
   Back to the failed test: it failed because the Spark job failed. I suggest you look at the detailed log thrown by the Spark job, just above the assertion log.
   Two conditions are necessary: 1. runnable Spark in the local environment; 2. `SPARK_HOME` set.
   Additional note: if Spark uses the default config, run `mkdir /tmp/spark-events/` to create the temp directory, and do not use the brew installation of Spark.
   
   If it still fails, please let me know, either here or by email.
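   The two preconditions above could also be checked programmatically before the integration test runs. A minimal sketch (`SparkTestEnvCheck` is a hypothetical helper, not part of the Hudi code):

   ```java
   import java.io.IOException;
   import java.nio.file.Files;
   import java.nio.file.Path;
   import java.nio.file.Paths;

   public class SparkTestEnvCheck {
     /**
      * Checks the preconditions described above for running the
      * hudi-cli integration tests against a local Spark.
      */
     public static boolean ensureSparkTestEnv() throws IOException {
       // Spark's default event log directory; equivalent of `mkdir /tmp/spark-events/`.
       Path eventLogDir = Paths.get("/tmp/spark-events");
       Files.createDirectories(eventLogDir); // no-op if it already exists
       // SPARK_HOME must point at a local Spark installation.
       String sparkHome = System.getenv("SPARK_HOME");
       return sparkHome != null && !sparkHome.isEmpty() && Files.isDirectory(eventLogDir);
     }
   }
   ```

   Calling this from a JUnit `@BeforeEach` (or skipping the test when it returns false) would surface the environment problem before the Spark job fails with an opaque assertion error.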





[GitHub] [incubator-hudi] hddong commented on a change in pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
hddong commented on a change in pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#discussion_r421365870



##########
File path: hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkMain.java
##########
@@ -82,17 +82,17 @@ public static void main(String[] args) throws Exception {
         break;
       case IMPORT:
       case UPSERT:
-        assert (args.length >= 12);
+        assert (args.length >= 13);
         String propsFilePath = null;
-        if (!StringUtils.isNullOrEmpty(args[11])) {
-          propsFilePath = args[11];
+        if (!StringUtils.isNullOrEmpty(args[12])) {
+          propsFilePath = args[12];
         }
         List<String> configs = new ArrayList<>();
-        if (args.length > 12) {
-          configs.addAll(Arrays.asList(args).subList(12, args.length));
+        if (args.length > 13) {
+          configs.addAll(Arrays.asList(args).subList(13, args.length));
         }
-        returnCode = dataLoad(jsc, command, args[1], args[2], args[3], args[4], args[5], args[6],
-            Integer.parseInt(args[7]), args[8], args[9], Integer.parseInt(args[10]), propsFilePath, configs);
+        returnCode = dataLoad(jsc, command, args[3], args[4], args[5], args[6], args[7], args[8],

Review comment:
       > **Please note that I am not talking about you here**. But these array indexes containing numbers are very ugly and unreadable. We should think of a way to improve it.
   > 
   > We should parse all parameters and it is best to do:
   > 
   > 1. Define appropriate variables to store each parameter to improve the readability of the code;
   > 2. Refactor it, yes, the parse of the parameters should be order-independent;
   > 
   > WDYT? @hddong @vinothchandar
   
   Agree, and there is an existing PR (#1174) by @pratyakshsharma
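   The order-independent parsing proposed above could be sketched roughly as follows (`NamedArgParser` is a hypothetical illustration, not the approach #1174 necessarily takes): collect `--key value` pairs into a map so the meaning of an argument no longer depends on its position.

   ```java
   import java.util.HashMap;
   import java.util.Map;

   public class NamedArgParser {
     /**
      * Parses "--key value" pairs into a map, so each parameter is looked up
      * by name instead of by a magic index into args[].
      */
     public static Map<String, String> parse(String[] args) {
       Map<String, String> opts = new HashMap<>();
       for (int i = 0; i < args.length - 1; i++) {
         if (args[i].startsWith("--")) {
           opts.put(args[i].substring(2), args[i + 1]);
           i++; // skip the value we just consumed
         }
       }
       return opts;
     }
   }
   ```

   With something like this, `dataLoad` could read named parameters (e.g. `opts.get("srcPath")`) instead of `args[3]` through `args[12]`, which is both more readable and resilient to reordering.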







[GitHub] [incubator-hudi] yanghua commented on pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
yanghua commented on pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#issuecomment-621580694


   @hddong Travis is red. Please check it.








[GitHub] [incubator-hudi] yanghua commented on a change in pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
yanghua commented on a change in pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#discussion_r420772541



##########
File path: hudi-cli/src/test/java/org/apache/hudi/cli/integ/ITTestHDFSParquetImportCommand.java
##########
@@ -0,0 +1,184 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *      http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hudi.cli.integ;
+
+import org.apache.avro.generic.GenericRecord;
+import org.apache.hadoop.fs.FSDataOutputStream;
+import org.apache.hadoop.fs.Path;
+import org.apache.hudi.cli.AbstractShellIntegrationTest;
+import org.apache.hudi.cli.HoodieCLI;
+import org.apache.hudi.cli.commands.TableCommand;
+import org.apache.hudi.common.HoodieClientTestUtils;
+import org.apache.hudi.common.HoodieTestDataGenerator;
+import org.apache.hudi.common.model.HoodieTableType;
+import org.apache.hudi.common.table.HoodieTableMetaClient;
+import org.apache.hudi.common.table.timeline.versioning.TimelineLayoutVersion;
+import org.apache.hudi.utilities.HDFSParquetImporter;
+import org.apache.hudi.utilities.TestHDFSParquetImporter;
+import org.apache.hudi.utilities.TestHDFSParquetImporter.HoodieTripModel;
+
+import org.apache.spark.sql.Dataset;
+import org.apache.spark.sql.Row;
+
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.BeforeEach;
+import org.springframework.shell.core.CommandResult;
+
+import java.io.File;
+import java.io.IOException;
+import java.text.ParseException;
+import java.util.List;
+import java.util.stream.Collectors;
+
+import static org.junit.jupiter.api.Assertions.assertAll;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertTrue;
+
+/**
+ * Test class for {@link org.apache.hudi.cli.commands.HDFSParquetImportCommand}.
+ */
+public class ITTestHDFSParquetImportCommand extends AbstractShellIntegrationTest {
+
+  private Path sourcePath;
+  private Path targetPath;
+  private String tableName;
+  private String schemaFile;
+  private String tablePath;
+
+  private List<GenericRecord> insertData;
+  private TestHDFSParquetImporter importer;
+
+  @BeforeEach
+  public void init() throws IOException, ParseException {
+    tableName = "test_table";
+    tablePath = basePath + File.separator + tableName;
+    sourcePath = new Path(basePath, "source");
+    targetPath = new Path(tablePath);
+    schemaFile = new Path(basePath, "file.schema").toString();
+
+    // create schema file
+    try (FSDataOutputStream schemaFileOS = fs.create(new Path(schemaFile))) {
+      schemaFileOS.write(HoodieTestDataGenerator.TRIP_EXAMPLE_SCHEMA.getBytes());
+    }
+
+    importer = new TestHDFSParquetImporter();
+    insertData = importer.createInsertRecords(sourcePath);
+  }
+
+  /**
+   * Test case for 'hdfsparquetimport' with insert.
+   */
+  @Test
+  public void testConvertWithInsert() throws IOException {
+    String command = String.format("hdfsparquetimport --srcPath %s --targetPath %s --tableName %s "
+        + "--tableType %s --rowKeyField %s" + " --partitionPathField %s --parallelism %s "
+        + "--schemaFilePath %s --format %s --sparkMemory %s --retry %s --sparkMaster %s",
+        sourcePath.toString(), targetPath.toString(), tableName, HoodieTableType.COPY_ON_WRITE.name(),
+        "_row_key", "timestamp", "1", schemaFile, "parquet", "2G", "1", "local");
+    CommandResult cr = getShell().executeCommand(command);
+
+    assertAll("Command run success",
+        () -> assertTrue(cr.isSuccess()),
+        () -> assertEquals("Table imported to hoodie format", cr.getResult().toString()));
+
+    // Check hudi table exist
+    String metaPath = targetPath + File.separator + HoodieTableMetaClient.METAFOLDER_NAME;
+    assertTrue(new File(metaPath).exists(), "Hoodie table not exist.");

Review comment:
       use `Files.exists(xx)`?
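
The `Files.exists` suggestion refers to `java.nio.file`; the assertion's existence check could be rewritten roughly like this (a sketch; `NioExistsCheck` is an illustrative name, and `metaPath` is assumed to be the string built above):

```java
import java.nio.file.Files;
import java.nio.file.Paths;

public class NioExistsCheck {
  /** NIO-based equivalent of `new File(metaPath).exists()`. */
  public static boolean metaFolderExists(String metaPath) {
    return Files.exists(Paths.get(metaPath));
  }
}
```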

##########
File path: hudi-cli/src/main/java/org/apache/hudi/cli/commands/HDFSParquetImportCommand.java
##########
@@ -57,6 +56,7 @@ public String convert(
       @CliOption(key = "schemaFilePath", mandatory = true,
           help = "Path for Avro schema file") final String schemaFilePath,
       @CliOption(key = "format", mandatory = true, help = "Format for the input data") final String format,
+      @CliOption(key = "sparkMaster", unspecifiedDefaultValue = "", help = "Spark Master ") String master,

Review comment:
       `"Spark Master "` -> `"Spark Master"` (remove the trailing space)

##########
File path: hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkMain.java
##########
@@ -82,17 +82,17 @@ public static void main(String[] args) throws Exception {
         break;
       case IMPORT:
       case UPSERT:
-        assert (args.length >= 12);
+        assert (args.length >= 13);
         String propsFilePath = null;
-        if (!StringUtils.isNullOrEmpty(args[11])) {
-          propsFilePath = args[11];
+        if (!StringUtils.isNullOrEmpty(args[12])) {
+          propsFilePath = args[12];
         }
         List<String> configs = new ArrayList<>();
-        if (args.length > 12) {
-          configs.addAll(Arrays.asList(args).subList(12, args.length));
+        if (args.length > 13) {
+          configs.addAll(Arrays.asList(args).subList(13, args.length));
         }
-        returnCode = dataLoad(jsc, command, args[1], args[2], args[3], args[4], args[5], args[6],
-            Integer.parseInt(args[7]), args[8], args[9], Integer.parseInt(args[10]), propsFilePath, configs);
+        returnCode = dataLoad(jsc, command, args[3], args[4], args[5], args[6], args[7], args[8],

Review comment:
       **Please note that this is not a criticism of you.** But these numeric array indexes are ugly and hard to read. We should think of a way to improve them.
   
   We should parse all the parameters up front, and ideally:
   1) Define appropriately named variables to hold each parameter, to improve the readability of the code;
   2) Refactor it so that parameter parsing is order-independent;
   
   WDYT? @hddong @vinothchandar 
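One possible shape of that refactor can be sketched as follows, under the assumption that the arguments remain positional for now. The class and field names here are invented for illustration and are not the actual `SparkMain` code:

```java
// Sketch: replace magic array indexes (args[3], args[12], ...) with a small
// holder type, so the meaning of each position is written down exactly once.
public class ImportArgsSketch {

    static final class ImportArgs {
        final String command;
        final String master;
        final String sparkMemory;
        final String srcPath;
        final String targetPath;

        private ImportArgs(String command, String master, String sparkMemory,
                           String srcPath, String targetPath) {
            this.command = command;
            this.master = master;
            this.sparkMemory = sparkMemory;
            this.srcPath = srcPath;
            this.targetPath = targetPath;
        }

        // All positional parsing happens in one place; the rest of the code
        // never touches a numeric index again.
        static ImportArgs parse(String[] args) {
            if (args.length < 5) {
                throw new IllegalArgumentException(
                    "expected at least 5 args, got " + args.length);
            }
            return new ImportArgs(args[0], args[1], args[2], args[3], args[4]);
        }
    }

    public static void main(String[] args) {
        String[] cli = {"IMPORT", "local", "2G", "/src", "/target"};
        ImportArgs parsed = ImportArgs.parse(cli);
        System.out.println(parsed.command + " " + parsed.master + " " + parsed.targetPath);
    }
}
```

Making the parsing fully order-independent (suggestion 2) would then be a matter of switching this parse step to named flags, without touching any call sites.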

##########
File path: hudi-cli/src/main/java/org/apache/hudi/cli/commands/HDFSParquetImportCommand.java
##########
@@ -78,8 +76,8 @@ public String convert(
       cmd = SparkCommand.UPSERT.toString();
     }
 
-    sparkLauncher.addAppArgs(cmd, srcPath, targetPath, tableName, tableType, rowKeyField, partitionPathField,
-        parallelism, schemaFilePath, sparkMemory, retry, propsFilePath);
+    sparkLauncher.addAppArgs(cmd, master, sparkMemory, srcPath, targetPath, tableName, tableType, rowKeyField,

Review comment:
       As a refactoring suggestion, it would be better to define a data structure to store the CLI args, to avoid changing the method signature every time an argument is added.
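A minimal sketch of that suggestion, with illustrative names rather than the real Hudi classes: collect the app args in one list so the argument ordering lives in a single place instead of in every `addAppArgs` call site:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: build the launcher argument list in one method, so adding a new
// slot (like sparkMaster here) does not require editing multiple call sites.
public class LauncherArgsSketch {

    static List<String> buildAppArgs(String cmd, String master, String sparkMemory,
                                     String srcPath, String targetPath) {
        List<String> appArgs = new ArrayList<>();
        appArgs.add(cmd);
        appArgs.add(master);       // new argument slots are added here, once
        appArgs.add(sparkMemory);
        appArgs.add(srcPath);
        appArgs.add(targetPath);
        return appArgs;
    }

    public static void main(String[] args) {
        List<String> appArgs = buildAppArgs("IMPORT", "local", "2G", "/src", "/target");
        // The real code would then do:
        //   sparkLauncher.addAppArgs(appArgs.toArray(new String[0]));
        System.out.println(String.join(",", appArgs));
    }
}
```

With this shape, the consumer in `SparkMain` and the producer in the CLI command share one definition of the ordering, which also makes the index-based parsing above easier to keep in sync.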




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-hudi] garyli1019 commented on pull request #1574: [HUDI-701]Add unit test for HDFSParquetImportCommand

Posted by GitBox <gi...@apache.org>.
garyli1019 commented on pull request #1574:
URL: https://github.com/apache/incubator-hudi/pull/1574#issuecomment-632339112


   Hi @hddong , thanks for your contribution on these tests.
   Some tests in the `hudi-cli` module fail in my local build. I believe this could be related to the runtime environment. Do you think we should set up this test in a Docker environment?
   
   Example stack trace for `testConvertWithInsert()`:
   ```
   expected: <true> but was: <false>
   Comparison Failure: 
   Expected :true
   Actual   :false
   <Click to see difference>
   
   
   
   java.lang.NullPointerException
   	at org.apache.hudi.cli.integ.ITTestHDFSParquetImportCommand.lambda$testConvertWithInsert$1(ITTestHDFSParquetImportCommand.java:100)
   	at org.junit.jupiter.api.AssertAll.lambda$assertAll$1(AssertAll.java:68)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.stream.ReferencePipeline$11$1.accept(ReferencePipeline.java:373)
   	at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.junit.jupiter.api.AssertAll.assertAll(AssertAll.java:77)
   	at org.junit.jupiter.api.AssertAll.assertAll(AssertAll.java:44)
   	at org.junit.jupiter.api.Assertions.assertAll(Assertions.java:2856)
   	at org.apache.hudi.cli.integ.ITTestHDFSParquetImportCommand.testConvertWithInsert(ITTestHDFSParquetImportCommand.java:98)
   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:686)
   	at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
   	at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
   	at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:140)
   	at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:84)
   	at org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
   	at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
   	at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
   	at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$6(TestMethodTestDescriptor.java:212)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:208)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:137)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:71)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:135)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125)
   	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80)
   	at java.util.ArrayList.forEach(ArrayList.java:1257)
   	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125)
   	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80)
   	at java.util.ArrayList.forEach(ArrayList.java:1257)
   	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125)
   	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80)
   	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:32)
   	at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
   	at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:51)
   	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:248)
   	at org.junit.platform.launcher.core.DefaultLauncher.lambda$execute$5(DefaultLauncher.java:211)
   	at org.junit.platform.launcher.core.DefaultLauncher.withInterceptedStreams(DefaultLauncher.java:226)
   	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:199)
   	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:132)
   	at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:69)
   	at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
   	at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:230)
   	at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:58)
   
   
   org.opentest4j.MultipleFailuresError: Command run success (2 failures)
   	org.opentest4j.AssertionFailedError: expected: <true> but was: <false>
   	java.lang.NullPointerException: <no message>
   
   	at org.junit.jupiter.api.AssertAll.assertAll(AssertAll.java:80)
   	at org.junit.jupiter.api.AssertAll.assertAll(AssertAll.java:44)
   	at org.junit.jupiter.api.Assertions.assertAll(Assertions.java:2856)
   	at org.apache.hudi.cli.integ.ITTestHDFSParquetImportCommand.testConvertWithInsert(ITTestHDFSParquetImportCommand.java:98)
   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:686)
   	at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
   	at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
   	at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:140)
   	at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:84)
   	at org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
   	at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
   	at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
   	at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$6(TestMethodTestDescriptor.java:212)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:208)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:137)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:71)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:135)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125)
   	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80)
   	at java.util.ArrayList.forEach(ArrayList.java:1257)
   	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125)
   	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80)
   	at java.util.ArrayList.forEach(ArrayList.java:1257)
   	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125)
   	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80)
   	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:32)
   	at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
   	at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:51)
   	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:248)
   	at org.junit.platform.launcher.core.DefaultLauncher.lambda$execute$5(DefaultLauncher.java:211)
   	at org.junit.platform.launcher.core.DefaultLauncher.withInterceptedStreams(DefaultLauncher.java:226)
   	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:199)
   	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:132)
   	at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:69)
   	at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
   	at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:230)
   	at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:58)
   	Suppressed: org.opentest4j.AssertionFailedError: expected: <true> but was: <false>
   		at org.junit.jupiter.api.AssertionUtils.fail(AssertionUtils.java:55)
   		at org.junit.jupiter.api.AssertTrue.assertTrue(AssertTrue.java:40)
   		at org.junit.jupiter.api.AssertTrue.assertTrue(AssertTrue.java:35)
   		at org.junit.jupiter.api.Assertions.assertTrue(Assertions.java:162)
   		at org.apache.hudi.cli.integ.ITTestHDFSParquetImportCommand.lambda$testConvertWithInsert$0(ITTestHDFSParquetImportCommand.java:99)
   		at org.junit.jupiter.api.AssertAll.lambda$assertAll$1(AssertAll.java:68)
   		at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   		at java.util.stream.ReferencePipeline$11$1.accept(ReferencePipeline.java:373)
   		at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
   		at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   		at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   		at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   		at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   		at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   		at org.junit.jupiter.api.AssertAll.assertAll(AssertAll.java:77)
   		... 66 more
   	Suppressed: java.lang.NullPointerException
   		at org.apache.hudi.cli.integ.ITTestHDFSParquetImportCommand.lambda$testConvertWithInsert$1(ITTestHDFSParquetImportCommand.java:100)
   		at org.junit.jupiter.api.AssertAll.lambda$assertAll$1(AssertAll.java:68)
   		at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   		at java.util.stream.ReferencePipeline$11$1.accept(ReferencePipeline.java:373)
   		at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
   		at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   		at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   		at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   		at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   		at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   		at org.junit.jupiter.api.AssertAll.assertAll(AssertAll.java:77)
   		... 66 more
   
   ```

