Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2020/08/06 02:08:32 UTC

[GitHub] [hudi] umehrot2 opened a new pull request #1924: [HUDI-999][Performance] Parallelize fetching of bootstrap source data files/partitions

umehrot2 opened a new pull request #1924:
URL: https://github.com/apache/hudi/pull/1924


   ## What is the purpose of the pull request
   
   This PR improves the performance of Hudi bootstrap by parallelizing the listing of source partitions/files using Spark.
   
   ## Brief change log
   
   - Updated `BootstrapUtils` to use the Spark context to list source partitions/files
   - Other changes resulting from the API change in `BootstrapUtils` to accept a `JavaSparkContext`
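
   For context, the leaf-partition selection that this listing feeds can be sketched in plain Java. The sketch below is illustrative and self-contained (class and method names are hypothetical, not Hudi's actual API): it groups partition paths by directory depth and keeps only the deepest level, mirroring what `getAllLeafFoldersWithFiles` does with `levelToPartitions`.

   ```java
   import java.util.ArrayList;
   import java.util.Arrays;
   import java.util.HashMap;
   import java.util.List;
   import java.util.Map;
   import java.util.OptionalInt;
   import java.util.Set;
   import java.util.TreeSet;

   // Illustrative sketch (not Hudi's actual API): given relative partition
   // paths, group them by depth and keep only the deepest level, mirroring
   // the leaf-partition selection in BootstrapUtils.
   public class LeafFolderSketch {
     public static List<String> deepestPartitions(List<String> relativePaths) {
       // Depth = number of '/' separators in the relative partition path.
       Map<Integer, Set<String>> levelToPartitions = new HashMap<>();
       for (String p : relativePaths) {
         int level = (int) p.chars().filter(ch -> ch == '/').count();
         levelToPartitions.computeIfAbsent(level, k -> new TreeSet<>()).add(p);
       }
       OptionalInt maxLevel = levelToPartitions.keySet().stream().mapToInt(x -> x).max();
       return maxLevel.isPresent()
           ? new ArrayList<>(levelToPartitions.get(maxLevel.getAsInt()))
           : new ArrayList<>();
     }

     public static void main(String[] args) {
       // Only the deepest level (one '/') survives: [2020/08]
       System.out.println(deepestPartitions(Arrays.asList("2020/08", "2020/08", "2020")));
     }
   }
   ```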
   
   ## Verify this pull request
   
   - Existing Bootstrap unit tests
   
   ## Committer checklist
   
    - [ ] Has a corresponding JIRA in PR title & commit
    
    - [ ] Commit message is descriptive of the change
    
    - [ ] CI is green
   
    - [ ] Necessary doc changes done or have another open PR
          
    - [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [hudi] vinothchandar commented on a change in pull request #1924: [HUDI-999][Performance] Parallelize fetching of bootstrap source data files/partitions

Posted by GitBox <gi...@apache.org>.
vinothchandar commented on a change in pull request #1924:
URL: https://github.com/apache/hudi/pull/1924#discussion_r466850903



##########
File path: hudi-client/src/main/java/org/apache/hudi/table/action/bootstrap/BootstrapUtils.java
##########
@@ -41,37 +48,87 @@
    * Returns leaf folders with files under a path.
    * @param fs  File System
    * @param basePathStr Base Path to look for leaf folders
-   * @param filePathFilter  Filters to skip directories/paths
+   * @param jsc Java spark context
    * @return list of partition paths with files under them.
    * @throws IOException
    */
   public static List<Pair<String, List<HoodieFileStatus>>> getAllLeafFoldersWithFiles(FileSystem fs, String basePathStr,
-                                                                                      PathFilter filePathFilter) throws IOException {
+      JavaSparkContext jsc) throws IOException {
     final Path basePath = new Path(basePathStr);
     final Map<Integer, List<String>> levelToPartitions = new HashMap<>();
     final Map<String, List<HoodieFileStatus>> partitionToFiles = new HashMap<>();
-    FSUtils.processFiles(fs, basePathStr, (status) -> {
-      if (status.isFile() && filePathFilter.accept(status.getPath())) {
-        String relativePath = FSUtils.getRelativePartitionPath(basePath, status.getPath().getParent());
-        List<HoodieFileStatus> statusList = partitionToFiles.get(relativePath);
-        if (null == statusList) {
-          Integer level = (int) relativePath.chars().filter(ch -> ch == '/').count();
-          List<String> dirs = levelToPartitions.get(level);
-          if (null == dirs) {
-            dirs = new ArrayList<>();
-            levelToPartitions.put(level, dirs);
+    PathFilter filePathFilter = getFilePathFilter();
+    PathFilter metaPathFilter = getExcludeMetaPathFilter();
+
+    FileStatus[] topLevelStatuses = fs.listStatus(new Path(basePathStr));
+    List<String> subDirectories = new ArrayList<>();
+
+    List<Pair<HoodieFileStatus, Pair<Integer, String>>> result = new ArrayList<>();

Review comment:
       Sounds good . @umehrot2 







[GitHub] [hudi] umehrot2 commented on a change in pull request #1924: [HUDI-999][Performance] Parallelize fetching of bootstrap source data files/partitions

Posted by GitBox <gi...@apache.org>.
umehrot2 commented on a change in pull request #1924:
URL: https://github.com/apache/hudi/pull/1924#discussion_r466779443



##########
File path: hudi-client/src/main/java/org/apache/hudi/table/action/bootstrap/BootstrapUtils.java
##########
@@ -41,37 +48,87 @@
    * Returns leaf folders with files under a path.
    * @param fs  File System
    * @param basePathStr Base Path to look for leaf folders
-   * @param filePathFilter  Filters to skip directories/paths
+   * @param jsc Java spark context
    * @return list of partition paths with files under them.
    * @throws IOException
    */
   public static List<Pair<String, List<HoodieFileStatus>>> getAllLeafFoldersWithFiles(FileSystem fs, String basePathStr,
-                                                                                      PathFilter filePathFilter) throws IOException {
+      JavaSparkContext jsc) throws IOException {
     final Path basePath = new Path(basePathStr);
     final Map<Integer, List<String>> levelToPartitions = new HashMap<>();
     final Map<String, List<HoodieFileStatus>> partitionToFiles = new HashMap<>();
-    FSUtils.processFiles(fs, basePathStr, (status) -> {
-      if (status.isFile() && filePathFilter.accept(status.getPath())) {
-        String relativePath = FSUtils.getRelativePartitionPath(basePath, status.getPath().getParent());
-        List<HoodieFileStatus> statusList = partitionToFiles.get(relativePath);
-        if (null == statusList) {
-          Integer level = (int) relativePath.chars().filter(ch -> ch == '/').count();
-          List<String> dirs = levelToPartitions.get(level);
-          if (null == dirs) {
-            dirs = new ArrayList<>();
-            levelToPartitions.put(level, dirs);
+    PathFilter filePathFilter = getFilePathFilter();
+    PathFilter metaPathFilter = getExcludeMetaPathFilter();
+
+    FileStatus[] topLevelStatuses = fs.listStatus(new Path(basePathStr));
+    List<String> subDirectories = new ArrayList<>();
+
+    List<Pair<HoodieFileStatus, Pair<Integer, String>>> result = new ArrayList<>();

Review comment:
       Only the outer structure is a bit similar in terms of first listing and taking action on top level files, and then using `spark context` to perform the same action on sub-directories in parallel. But the inner logic is different and values being collected are different.
   
   If we really want to re-use the common outer logic, it would require exploring extracting the inner logic into `serializable functions` that would work fine with the `spark context` as well. So, to not over-complicate this PR, I can explore this separately if it's okay. I have created a new Jira https://issues.apache.org/jira/browse/HUDI-1158 where I have listed the two optimizations we discussed w.r.t. parallel listing behavior:
   
   - The parallelization should be at leaf partition directory level and not just at the top directory level
   - Extract out common code paths
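
   The "serializable functions" idea above can be sketched as follows: a small `Serializable` functional interface whose implementations could be shipped inside a Spark closure, keeping the outer listing loop shared. All names here are illustrative assumptions, not Hudi's actual API.

   ```java
   import java.io.Serializable;
   import java.util.ArrayList;
   import java.util.Arrays;
   import java.util.List;

   // Illustrative sketch: extract the per-file inner logic into a serializable
   // function so the same outer loop could run on the driver or inside a Spark
   // closure. Names are hypothetical, not Hudi's actual API.
   public class SerializableListing {
     // Spark requires closures to be Serializable; a dedicated interface makes
     // that explicit and keeps the inner logic swappable.
     @FunctionalInterface
     public interface FileHandler<T> extends Serializable {
       T handle(String filePath);
     }

     public static <T> List<T> processPaths(List<String> paths, FileHandler<T> handler) {
       List<T> out = new ArrayList<>();
       for (String p : paths) {
         out.add(handler.handle(p));
       }
       return out;
     }

     public static void main(String[] args) {
       // Example inner logic: compute the directory depth of each path.
       FileHandler<Integer> depth = p -> (int) p.chars().filter(ch -> ch == '/').count();
       // Prints [2, 0]
       System.out.println(processPaths(Arrays.asList("a/b/c.parquet", "x.parquet"), depth));
     }
   }
   ```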







[GitHub] [hudi] vinothchandar merged pull request #1924: [HUDI-999][Performance] Parallelize fetching of bootstrap source data files/partitions

Posted by GitBox <gi...@apache.org>.
vinothchandar merged pull request #1924:
URL: https://github.com/apache/hudi/pull/1924


   







[GitHub] [hudi] vinothchandar commented on a change in pull request #1924: [HUDI-999][Performance] Parallelize fetching of bootstrap source data files/partitions

Posted by GitBox <gi...@apache.org>.
vinothchandar commented on a change in pull request #1924:
URL: https://github.com/apache/hudi/pull/1924#discussion_r466217168



##########
File path: hudi-client/src/main/java/org/apache/hudi/table/action/bootstrap/BootstrapUtils.java
##########
@@ -41,37 +48,87 @@
    * Returns leaf folders with files under a path.
    * @param fs  File System
    * @param basePathStr Base Path to look for leaf folders
-   * @param filePathFilter  Filters to skip directories/paths
+   * @param jsc Java spark context
    * @return list of partition paths with files under them.
    * @throws IOException
    */
   public static List<Pair<String, List<HoodieFileStatus>>> getAllLeafFoldersWithFiles(FileSystem fs, String basePathStr,
-                                                                                      PathFilter filePathFilter) throws IOException {
+      JavaSparkContext jsc) throws IOException {
     final Path basePath = new Path(basePathStr);
     final Map<Integer, List<String>> levelToPartitions = new HashMap<>();
     final Map<String, List<HoodieFileStatus>> partitionToFiles = new HashMap<>();
-    FSUtils.processFiles(fs, basePathStr, (status) -> {
-      if (status.isFile() && filePathFilter.accept(status.getPath())) {
-        String relativePath = FSUtils.getRelativePartitionPath(basePath, status.getPath().getParent());
-        List<HoodieFileStatus> statusList = partitionToFiles.get(relativePath);
-        if (null == statusList) {
-          Integer level = (int) relativePath.chars().filter(ch -> ch == '/').count();
-          List<String> dirs = levelToPartitions.get(level);
-          if (null == dirs) {
-            dirs = new ArrayList<>();
-            levelToPartitions.put(level, dirs);
+    PathFilter filePathFilter = getFilePathFilter();
+    PathFilter metaPathFilter = getExcludeMetaPathFilter();
+
+    FileStatus[] topLevelStatuses = fs.listStatus(new Path(basePathStr));
+    List<String> subDirectories = new ArrayList<>();
+
+    List<Pair<HoodieFileStatus, Pair<Integer, String>>> result = new ArrayList<>();
+
+    for (FileStatus topLevelStatus: topLevelStatuses) {
+      if (topLevelStatus.isFile() && filePathFilter.accept(topLevelStatus.getPath())) {
+        String relativePath = FSUtils.getRelativePartitionPath(basePath, topLevelStatus.getPath().getParent());
+        Integer level = (int) relativePath.chars().filter(ch -> ch == '/').count();
+        HoodieFileStatus hoodieFileStatus = FileStatusUtils.fromFileStatus(topLevelStatus);
+        result.add(Pair.of(hoodieFileStatus, Pair.of(level, relativePath)));
+      } else if (metaPathFilter.accept(topLevelStatus.getPath())) {
+        subDirectories.add(topLevelStatus.getPath().toString());
+      }
+    }
+
+    if (subDirectories.size() > 0) {
+      result.addAll(jsc.parallelize(subDirectories, subDirectories.size()).flatMap(directory -> {
+        PathFilter pathFilter = getFilePathFilter();
+        Path path = new Path(directory);
+        FileSystem fileSystem = path.getFileSystem(new Configuration());
+        RemoteIterator<LocatedFileStatus> itr = fileSystem.listFiles(path, true);
+        List<Pair<HoodieFileStatus, Pair<Integer, String>>> res = new ArrayList<>();
+        while (itr.hasNext()) {
+          FileStatus status = itr.next();
+          if (pathFilter.accept(status.getPath())) {
+            String relativePath = FSUtils.getRelativePartitionPath(new Path(basePathStr), status.getPath().getParent());
+            Integer level = (int) relativePath.chars().filter(ch -> ch == '/').count();
+            HoodieFileStatus hoodieFileStatus = FileStatusUtils.fromFileStatus(status);
+            res.add(Pair.of(hoodieFileStatus, Pair.of(level, relativePath)));
           }
-          dirs.add(relativePath);
-          statusList = new ArrayList<>();
-          partitionToFiles.put(relativePath, statusList);
         }
-        statusList.add(FileStatusUtils.fromFileStatus(status));
+        return res.iterator();
+      }).collect());
+    }
+
+    result.forEach(val -> {
+      String relativePath = val.getRight().getRight();
+      List<HoodieFileStatus> statusList = partitionToFiles.get(relativePath);
+      if (null == statusList) {
+        Integer level = val.getRight().getLeft();
+        List<String> dirs = levelToPartitions.get(level);
+        if (null == dirs) {
+          dirs = new ArrayList<>();
+          levelToPartitions.put(level, dirs);
+        }
+        dirs.add(relativePath);
+        statusList = new ArrayList<>();
+        partitionToFiles.put(relativePath, statusList);
       }
-      return true;
-    }, true);
+      statusList.add(val.getLeft());
+    });
+
     OptionalInt maxLevelOpt = levelToPartitions.keySet().stream().mapToInt(x -> x).max();
     int maxLevel = maxLevelOpt.orElse(-1);
     return maxLevel >= 0 ? levelToPartitions.get(maxLevel).stream()
-        .map(d -> Pair.of(d, partitionToFiles.get(d))).collect(Collectors.toList()) : new ArrayList<>();
+            .map(d -> Pair.of(d, partitionToFiles.get(d))).collect(Collectors.toList()) : new ArrayList<>();
+  }
+
+  private static PathFilter getFilePathFilter() {
+    return (path) -> {
+      // TODO: Needs to be abstracted out when supporting different formats
+      // TODO: Remove hoodieFilter
+      return path.getName().endsWith(HoodieFileFormat.PARQUET.getFileExtension());

Review comment:
       can we just use the table's base file format here?
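
       One way to read this suggestion is a filter parameterized by the table's base-file extension instead of a hard-coded Parquet suffix. The sketch below uses a plain `Predicate<String>` in place of Hadoop's `PathFilter` to stay self-contained; the class and method names are assumptions for illustration, not Hudi's actual API.

       ```java
       import java.util.function.Predicate;

       // Illustrative sketch: build the file filter from a configurable base-file
       // extension instead of hard-coding HoodieFileFormat.PARQUET. A plain
       // Predicate<String> stands in for Hadoop's PathFilter here; names are
       // hypothetical, not Hudi's actual API.
       public class FormatFilterSketch {
         public static Predicate<String> filePathFilter(String baseFileExtension) {
           return fileName -> fileName.endsWith(baseFileExtension);
         }

         public static void main(String[] args) {
           Predicate<String> parquetFilter = filePathFilter(".parquet");
           System.out.println(parquetFilter.test("part-0001.parquet")); // true
           System.out.println(parquetFilter.test("part-0001.orc"));     // false
         }
       }
       ```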







[GitHub] [hudi] umehrot2 commented on a change in pull request #1924: [HUDI-999][Performance] Parallelize fetching of bootstrap source data files/partitions

Posted by GitBox <gi...@apache.org>.
umehrot2 commented on a change in pull request #1924:
URL: https://github.com/apache/hudi/pull/1924#discussion_r466774810



##########
File path: hudi-client/src/main/java/org/apache/hudi/table/action/bootstrap/BootstrapUtils.java
##########
@@ -41,37 +48,87 @@
    * Returns leaf folders with files under a path.
    * @param fs  File System
    * @param basePathStr Base Path to look for leaf folders
-   * @param filePathFilter  Filters to skip directories/paths
+   * @param jsc Java spark context
    * @return list of partition paths with files under them.
    * @throws IOException
    */
   public static List<Pair<String, List<HoodieFileStatus>>> getAllLeafFoldersWithFiles(FileSystem fs, String basePathStr,
-                                                                                      PathFilter filePathFilter) throws IOException {
+      JavaSparkContext jsc) throws IOException {
     final Path basePath = new Path(basePathStr);
     final Map<Integer, List<String>> levelToPartitions = new HashMap<>();
     final Map<String, List<HoodieFileStatus>> partitionToFiles = new HashMap<>();
-    FSUtils.processFiles(fs, basePathStr, (status) -> {
-      if (status.isFile() && filePathFilter.accept(status.getPath())) {
-        String relativePath = FSUtils.getRelativePartitionPath(basePath, status.getPath().getParent());
-        List<HoodieFileStatus> statusList = partitionToFiles.get(relativePath);
-        if (null == statusList) {
-          Integer level = (int) relativePath.chars().filter(ch -> ch == '/').count();
-          List<String> dirs = levelToPartitions.get(level);
-          if (null == dirs) {
-            dirs = new ArrayList<>();
-            levelToPartitions.put(level, dirs);
+    PathFilter filePathFilter = getFilePathFilter();
+    PathFilter metaPathFilter = getExcludeMetaPathFilter();
+
+    FileStatus[] topLevelStatuses = fs.listStatus(new Path(basePathStr));
+    List<String> subDirectories = new ArrayList<>();
+
+    List<Pair<HoodieFileStatus, Pair<Integer, String>>> result = new ArrayList<>();
+
+    for (FileStatus topLevelStatus: topLevelStatuses) {
+      if (topLevelStatus.isFile() && filePathFilter.accept(topLevelStatus.getPath())) {
+        String relativePath = FSUtils.getRelativePartitionPath(basePath, topLevelStatus.getPath().getParent());
+        Integer level = (int) relativePath.chars().filter(ch -> ch == '/').count();
+        HoodieFileStatus hoodieFileStatus = FileStatusUtils.fromFileStatus(topLevelStatus);
+        result.add(Pair.of(hoodieFileStatus, Pair.of(level, relativePath)));
+      } else if (metaPathFilter.accept(topLevelStatus.getPath())) {
+        subDirectories.add(topLevelStatus.getPath().toString());
+      }
+    }
+
+    if (subDirectories.size() > 0) {
+      result.addAll(jsc.parallelize(subDirectories, subDirectories.size()).flatMap(directory -> {
+        PathFilter pathFilter = getFilePathFilter();
+        Path path = new Path(directory);
+        FileSystem fileSystem = path.getFileSystem(new Configuration());
+        RemoteIterator<LocatedFileStatus> itr = fileSystem.listFiles(path, true);
+        List<Pair<HoodieFileStatus, Pair<Integer, String>>> res = new ArrayList<>();
+        while (itr.hasNext()) {
+          FileStatus status = itr.next();
+          if (pathFilter.accept(status.getPath())) {
+            String relativePath = FSUtils.getRelativePartitionPath(new Path(basePathStr), status.getPath().getParent());
+            Integer level = (int) relativePath.chars().filter(ch -> ch == '/').count();
+            HoodieFileStatus hoodieFileStatus = FileStatusUtils.fromFileStatus(status);
+            res.add(Pair.of(hoodieFileStatus, Pair.of(level, relativePath)));
           }
-          dirs.add(relativePath);
-          statusList = new ArrayList<>();
-          partitionToFiles.put(relativePath, statusList);
         }
-        statusList.add(FileStatusUtils.fromFileStatus(status));
+        return res.iterator();
+      }).collect());
+    }
+
+    result.forEach(val -> {
+      String relativePath = val.getRight().getRight();
+      List<HoodieFileStatus> statusList = partitionToFiles.get(relativePath);
+      if (null == statusList) {
+        Integer level = val.getRight().getLeft();
+        List<String> dirs = levelToPartitions.get(level);
+        if (null == dirs) {
+          dirs = new ArrayList<>();
+          levelToPartitions.put(level, dirs);
+        }
+        dirs.add(relativePath);
+        statusList = new ArrayList<>();
+        partitionToFiles.put(relativePath, statusList);
       }
-      return true;
-    }, true);
+      statusList.add(val.getLeft());
+    });
+
     OptionalInt maxLevelOpt = levelToPartitions.keySet().stream().mapToInt(x -> x).max();
     int maxLevel = maxLevelOpt.orElse(-1);
     return maxLevel >= 0 ? levelToPartitions.get(maxLevel).stream()
-        .map(d -> Pair.of(d, partitionToFiles.get(d))).collect(Collectors.toList()) : new ArrayList<>();
+            .map(d -> Pair.of(d, partitionToFiles.get(d))).collect(Collectors.toList()) : new ArrayList<>();
+  }
+
+  private static PathFilter getFilePathFilter() {
+    return (path) -> {
+      // TODO: Needs to be abstracted out when supporting different formats
+      // TODO: Remove hoodieFilter
+      return path.getName().endsWith(HoodieFileFormat.PARQUET.getFileExtension());

Review comment:
       Done.







[GitHub] [hudi] vinothchandar commented on a change in pull request #1924: [HUDI-999][Performance] Parallelize fetching of bootstrap source data files/partitions

Posted by GitBox <gi...@apache.org>.
vinothchandar commented on a change in pull request #1924:
URL: https://github.com/apache/hudi/pull/1924#discussion_r466216468



##########
File path: hudi-client/src/main/java/org/apache/hudi/table/action/bootstrap/BootstrapUtils.java
##########
@@ -41,37 +48,87 @@
    * Returns leaf folders with files under a path.
    * @param fs  File System
    * @param basePathStr Base Path to look for leaf folders
-   * @param filePathFilter  Filters to skip directories/paths
+   * @param jsc Java spark context
    * @return list of partition paths with files under them.
    * @throws IOException
    */
   public static List<Pair<String, List<HoodieFileStatus>>> getAllLeafFoldersWithFiles(FileSystem fs, String basePathStr,
-                                                                                      PathFilter filePathFilter) throws IOException {
+      JavaSparkContext jsc) throws IOException {
     final Path basePath = new Path(basePathStr);
     final Map<Integer, List<String>> levelToPartitions = new HashMap<>();
     final Map<String, List<HoodieFileStatus>> partitionToFiles = new HashMap<>();
-    FSUtils.processFiles(fs, basePathStr, (status) -> {
-      if (status.isFile() && filePathFilter.accept(status.getPath())) {
-        String relativePath = FSUtils.getRelativePartitionPath(basePath, status.getPath().getParent());
-        List<HoodieFileStatus> statusList = partitionToFiles.get(relativePath);
-        if (null == statusList) {
-          Integer level = (int) relativePath.chars().filter(ch -> ch == '/').count();
-          List<String> dirs = levelToPartitions.get(level);
-          if (null == dirs) {
-            dirs = new ArrayList<>();
-            levelToPartitions.put(level, dirs);
+    PathFilter filePathFilter = getFilePathFilter();
+    PathFilter metaPathFilter = getExcludeMetaPathFilter();
+
+    FileStatus[] topLevelStatuses = fs.listStatus(new Path(basePathStr));
+    List<String> subDirectories = new ArrayList<>();
+
+    List<Pair<HoodieFileStatus, Pair<Integer, String>>> result = new ArrayList<>();

Review comment:
       we had some very similar code for marker dir listing? can we see if we can reuse some code here across them?



