Posted to issues@iceberg.apache.org by "RussellSpitzer (via GitHub)" <gi...@apache.org> on 2023/06/21 18:19:05 UTC

[GitHub] [iceberg] RussellSpitzer commented on a diff in pull request #7744: Core, Spark: Fix migrate table in case of partitioned table with partition containing a special character

RussellSpitzer commented on code in PR #7744:
URL: https://github.com/apache/iceberg/pull/7744#discussion_r1237406627


##########
data/src/main/java/org/apache/iceberg/data/TableMigrationUtil.java:
##########
@@ -162,40 +182,44 @@ private static Metrics getAvroMetrics(Path path, Configuration conf) {
   }
 
   private static Metrics getParquetMetrics(
-      Path path, Configuration conf, MetricsConfig metricsSpec, NameMapping mapping) {
+      Path path, Configuration conf, MetricsConfig metricsConfig, NameMapping mapping) {
     try {
       InputFile file = HadoopInputFile.fromPath(path, conf);
-      return ParquetUtil.fileMetrics(file, metricsSpec, mapping);
+      return ParquetUtil.fileMetrics(file, metricsConfig, mapping);
     } catch (UncheckedIOException e) {
       throw new RuntimeException("Unable to read the metrics of the Parquet file: " + path, e);
     }
   }
 
   private static Metrics getOrcMetrics(
-      Path path, Configuration conf, MetricsConfig metricsSpec, NameMapping mapping) {
+      Path path, Configuration conf, MetricsConfig metricsConfig, NameMapping mapping) {
     try {
-      return OrcMetrics.fromInputFile(HadoopInputFile.fromPath(path, conf), metricsSpec, mapping);
+      return OrcMetrics.fromInputFile(HadoopInputFile.fromPath(path, conf), metricsConfig, mapping);
     } catch (UncheckedIOException e) {
       throw new RuntimeException("Unable to read the metrics of the Orc file: " + path, e);
     }
   }
 
   private static DataFile buildDataFile(
-      FileStatus stat, String partitionKey, PartitionSpec spec, Metrics metrics, String format) {
+      FileStatus stat,
+      List<String> partitionValues,
+      PartitionSpec spec,
+      Metrics metrics,
+      String format) {
     return DataFiles.builder(spec)
         .withPath(stat.getPath().toString())
         .withFormat(format)
         .withFileSizeInBytes(stat.getLen())
         .withMetrics(metrics)
-        .withPartitionPath(partitionKey)
+        .withPartitionValues(partitionValues)
         .build();
   }
 
-  private static ExecutorService migrationService(int concurrentDeletes) {
+  private static ExecutorService migrationService(int numThreads) {

Review Comment:
   why is this renamed?
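
   For context on the larger change in this hunk (`withPartitionPath` replaced by `withPartitionValues`): encoding partition values into a single `k1=v1/k2=v2` path string is ambiguous once a value itself contains `/` or `=`, which is the special-character bug the PR title describes. The sketch below is not Iceberg code; it is a hypothetical, self-contained illustration of why path-style parsing is lossy and why passing the values as a list avoids re-parsing entirely.

   ```java
   import java.util.ArrayList;
   import java.util.Arrays;
   import java.util.List;

   public class PartitionPathDemo {
     // Naive path-style parsing, in the spirit of a withPartitionPath-like API:
     // split the path on '/', then take everything after '=' in each segment.
     static List<String> parseValues(String partitionPath) {
       List<String> values = new ArrayList<>();
       for (String part : partitionPath.split("/")) {
         values.add(part.substring(part.indexOf('=') + 1));
       }
       return values;
     }

     public static void main(String[] args) {
       // A single partition value containing '/' is wrongly split into three.
       System.out.println(parseValues("date=2023/06/21"));
       // prints [2023, 06, 21] -- three values instead of the intended one

       // Passing the values directly (withPartitionValues-style) keeps them
       // intact, because nothing has to be re-parsed out of a path string.
       List<String> values = Arrays.asList("2023/06/21");
       System.out.println(values); // prints [2023/06/21]
     }
   }
   ```

   The design point: once values travel as a `List<String>`, characters that are meaningful in a path no longer need escaping or parsing anywhere downstream.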



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
For additional commands, e-mail: issues-help@iceberg.apache.org