Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2020/01/21 16:38:39 UTC

[GitHub] [incubator-hudi] goldfix opened a new issue #1266: BlockAlignedAvroParquetWriter does not support scheme s3n

goldfix opened a new issue #1266: BlockAlignedAvroParquetWriter does not support scheme s3n
URL: https://github.com/apache/incubator-hudi/issues/1266
 
 
   Hello,
   
   When I try to write data to S3, an exception is thrown.
   
   **To Reproduce**
   
   ```
   spark-shell \
     --packages com.amazonaws:aws-java-sdk:1.10.34,org.apache.hadoop:hadoop-aws:2.7.3,org.apache.hudi:hudi-spark-bundle:0.5.0-incubating \
     --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" \
     --conf "spark.sql.hive.convertMetastoreParquet=false" \
     --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3native.NativeS3FileSystem" \
     --conf "spark.hadoop.fs.s3.awsSecretAccessKey=<XYZ>" \
     --conf "spark.hadoop.fs.s3.awsAccessKeyId=<XYZ>" \
     --conf "spark.hadoop.fs.defaultFS=s3://my-bucket"
   ```
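   
   The `spark.hadoop.*` options above are just Hadoop configuration keys passed through Spark. The same S3 settings can also be applied inside the shell session; a minimal sketch, assuming the same placeholder credentials and bucket as in the command above:
   
   ```
   // Equivalent in-session configuration (sketch only; placeholder values as in the command above).
   val hadoopConf = spark.sparkContext.hadoopConfiguration
   hadoopConf.set("fs.s3.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
   hadoopConf.set("fs.s3.awsSecretAccessKey", "<XYZ>")
   hadoopConf.set("fs.s3.awsAccessKeyId", "<XYZ>")
   ```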
   
   Code:
   
   ```
   import org.apache.hudi.QuickstartUtils._
   import scala.collection.JavaConversions._
   import org.apache.spark.sql.SaveMode._
   import org.apache.hudi.DataSourceReadOptions._
   import org.apache.hudi.DataSourceWriteOptions._
   import org.apache.hudi.config.HoodieWriteConfig._
   import org.apache.hudi.hive.MultiPartKeysValueExtractor
   import org.apache.spark.sql.functions._
   
   val tableName = "hudi_cow_table"
   val basePath = "s3://my-bucket/hudi_cow_table"
   val dataGen = new DataGenerator
   
   
   val inserts = convertToStringList(dataGen.generateInserts(10))
   val df = spark.read.json(spark.sparkContext.parallelize(inserts, 2))
   
   df.write.format("org.apache.hudi").
       options(getQuickstartWriteConfigs).
       option(PRECOMBINE_FIELD_OPT_KEY, "ts").
       option(RECORDKEY_FIELD_OPT_KEY, "uuid").
       option(PARTITIONPATH_FIELD_OPT_KEY, "partitionpath").
       option(TABLE_NAME, tableName).
       mode(Overwrite).
       save(basePath)
   ```
   
   **Environment Description**
   
   * Hudi version : org.apache.hudi:hudi-spark-bundle:0.5.0-incubating
   
   * Spark version : 2.4.4
   
   * Hadoop version : 2.7
   
   * Storage (HDFS/S3/GCS..) : S3
   
   * Running on Docker? (yes/no) : no
   
   **Stacktrace**
   
   ```
   java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.getHoodieScheme(HoodieWrapperFileSystem.java:109)
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.convertToHoodiePath(HoodieWrapperFileSystem.java:85)
           at org.apache.hudi.io.storage.HoodieParquetWriter.<init>(HoodieParquetWriter.java:57)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.newParquetStorageWriter(HoodieStorageWriterFactory.java:60)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.getStorageWriter(HoodieStorageWriterFactory.java:44)
           at org.apache.hudi.io.HoodieCreateHandle.<init>(HoodieCreateHandle.java:70)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:137)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:125)
           at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:38)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:120)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   20/01/21 17:31:36 ERROR BoundedInMemoryExecutor: error consuming records
   java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.getHoodieScheme(HoodieWrapperFileSystem.java:109)
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.convertToHoodiePath(HoodieWrapperFileSystem.java:85)
           at org.apache.hudi.io.storage.HoodieParquetWriter.<init>(HoodieParquetWriter.java:57)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.newParquetStorageWriter(HoodieStorageWriterFactory.java:60)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.getStorageWriter(HoodieStorageWriterFactory.java:44)
           at org.apache.hudi.io.HoodieCreateHandle.<init>(HoodieCreateHandle.java:70)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:137)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:125)
           at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:38)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:120)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   20/01/21 17:31:36 ERROR BoundedInMemoryExecutor: error consuming records
   java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.getHoodieScheme(HoodieWrapperFileSystem.java:109)
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.convertToHoodiePath(HoodieWrapperFileSystem.java:85)
           at org.apache.hudi.io.storage.HoodieParquetWriter.<init>(HoodieParquetWriter.java:57)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.newParquetStorageWriter(HoodieStorageWriterFactory.java:60)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.getStorageWriter(HoodieStorageWriterFactory.java:44)
           at org.apache.hudi.io.HoodieCreateHandle.<init>(HoodieCreateHandle.java:70)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:137)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:125)
           at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:38)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:120)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   20/01/21 17:31:36 WARN BlockManager: Putting block rdd_49_2 failed due to exception java.lang.RuntimeException: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n.
   20/01/21 17:31:36 WARN BlockManager: Putting block rdd_49_1 failed due to exception java.lang.RuntimeException: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n.
   20/01/21 17:31:36 WARN BlockManager: Block rdd_49_2 could not be removed as it was not found on disk or in memory
   20/01/21 17:31:36 WARN BlockManager: Block rdd_49_1 could not be removed as it was not found on disk or in memory
   20/01/21 17:31:36 WARN BlockManager: Putting block rdd_49_0 failed due to exception java.lang.RuntimeException: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n.
   20/01/21 17:31:36 WARN BlockManager: Block rdd_49_0 could not be removed as it was not found on disk or in memory
   20/01/21 17:31:36 ERROR Executor: Exception in task 0.0 in stage 21.0 (TID 25)
   java.lang.RuntimeException: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:122)
           at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
           at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
           at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
           at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
           at org.apache.spark.storage.memory.MemoryStore.putIteratorAsBytes(MemoryStore.scala:349)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
           at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
           at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
           at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
           at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
           at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
           at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
           at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
           at org.apache.spark.scheduler.Task.run(Task.scala:123)
           at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:103)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:42)
           at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:120)
           ... 23 more
   Caused by: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:142)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:99)
           ... 25 more
   Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at java.util.concurrent.FutureTask.report(FutureTask.java:122)
           at java.util.concurrent.FutureTask.get(FutureTask.java:192)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:140)
           ... 26 more
   Caused by: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.getHoodieScheme(HoodieWrapperFileSystem.java:109)
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.convertToHoodiePath(HoodieWrapperFileSystem.java:85)
           at org.apache.hudi.io.storage.HoodieParquetWriter.<init>(HoodieParquetWriter.java:57)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.newParquetStorageWriter(HoodieStorageWriterFactory.java:60)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.getStorageWriter(HoodieStorageWriterFactory.java:44)
           at org.apache.hudi.io.HoodieCreateHandle.<init>(HoodieCreateHandle.java:70)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:137)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:125)
           at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:38)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:120)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           ... 3 more
   20/01/21 17:31:36 ERROR Executor: Exception in task 1.0 in stage 21.0 (TID 26)
   java.lang.RuntimeException: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:122)
           at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
           at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
           at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
           at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
           at org.apache.spark.storage.memory.MemoryStore.putIteratorAsBytes(MemoryStore.scala:349)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
           at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
           at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
           at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
           at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
           at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
           at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
           at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
           at org.apache.spark.scheduler.Task.run(Task.scala:123)
           at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:103)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:42)
           at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:120)
           ... 23 more
   Caused by: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:142)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:99)
           ... 25 more
   Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at java.util.concurrent.FutureTask.report(FutureTask.java:122)
           at java.util.concurrent.FutureTask.get(FutureTask.java:192)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:140)
           ... 26 more
   Caused by: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.getHoodieScheme(HoodieWrapperFileSystem.java:109)
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.convertToHoodiePath(HoodieWrapperFileSystem.java:85)
           at org.apache.hudi.io.storage.HoodieParquetWriter.<init>(HoodieParquetWriter.java:57)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.newParquetStorageWriter(HoodieStorageWriterFactory.java:60)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.getStorageWriter(HoodieStorageWriterFactory.java:44)
           at org.apache.hudi.io.HoodieCreateHandle.<init>(HoodieCreateHandle.java:70)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:137)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:125)
           at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:38)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:120)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           ... 3 more
   20/01/21 17:31:36 ERROR Executor: Exception in task 2.0 in stage 21.0 (TID 27)
   java.lang.RuntimeException: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:122)
           at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
           at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
           at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
           at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
           at org.apache.spark.storage.memory.MemoryStore.putIteratorAsBytes(MemoryStore.scala:349)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
           at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
           at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
           at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
           at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
           at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
           at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
           at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
           at org.apache.spark.scheduler.Task.run(Task.scala:123)
           at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:103)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:42)
           at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:120)
           ... 23 more
   Caused by: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:142)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:99)
           ... 25 more
   Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at java.util.concurrent.FutureTask.report(FutureTask.java:122)
           at java.util.concurrent.FutureTask.get(FutureTask.java:192)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:140)
           ... 26 more
   Caused by: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.getHoodieScheme(HoodieWrapperFileSystem.java:109)
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.convertToHoodiePath(HoodieWrapperFileSystem.java:85)
           at org.apache.hudi.io.storage.HoodieParquetWriter.<init>(HoodieParquetWriter.java:57)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.newParquetStorageWriter(HoodieStorageWriterFactory.java:60)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.getStorageWriter(HoodieStorageWriterFactory.java:44)
           at org.apache.hudi.io.HoodieCreateHandle.<init>(HoodieCreateHandle.java:70)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:137)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:125)
           at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:38)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:120)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           ... 3 more
   20/01/21 17:31:36 WARN TaskSetManager: Lost task 1.0 in stage 21.0 (TID 26, localhost, executor driver): java.lang.RuntimeException: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:122)
           at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
           at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
           at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
           at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
           at org.apache.spark.storage.memory.MemoryStore.putIteratorAsBytes(MemoryStore.scala:349)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
           at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
           at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
           at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
           at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
           at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
           at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
           at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
           at org.apache.spark.scheduler.Task.run(Task.scala:123)
           at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:103)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:42)
           at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:120)
           ... 23 more
   Caused by: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:142)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:99)
           ... 25 more
   Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at java.util.concurrent.FutureTask.report(FutureTask.java:122)
           at java.util.concurrent.FutureTask.get(FutureTask.java:192)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:140)
           ... 26 more
   Caused by: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.getHoodieScheme(HoodieWrapperFileSystem.java:109)
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.convertToHoodiePath(HoodieWrapperFileSystem.java:85)
           at org.apache.hudi.io.storage.HoodieParquetWriter.<init>(HoodieParquetWriter.java:57)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.newParquetStorageWriter(HoodieStorageWriterFactory.java:60)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.getStorageWriter(HoodieStorageWriterFactory.java:44)
           at org.apache.hudi.io.HoodieCreateHandle.<init>(HoodieCreateHandle.java:70)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:137)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:125)
           at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:38)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:120)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           ... 3 more
   
   20/01/21 17:31:36 ERROR TaskSetManager: Task 1 in stage 21.0 failed 1 times; aborting job
   [Stage 21:>                                                         (0 + 2) / 3]org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 21.0 failed 1 times, most recent failure: Lost task 1.0 in stage 21.0 (TID 26, localhost, executor driver): java.lang.RuntimeException: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:122)
           at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
           at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
           at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
           at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
           at org.apache.spark.storage.memory.MemoryStore.putIteratorAsBytes(MemoryStore.scala:349)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
           at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
           at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
           at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
           at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
           at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
           at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
           at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
           at org.apache.spark.scheduler.Task.run(Task.scala:123)
           at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:103)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:42)
           at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:120)
           ... 23 more
   Caused by: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:142)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:99)
           ... 25 more
   Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at java.util.concurrent.FutureTask.report(FutureTask.java:122)
           at java.util.concurrent.FutureTask.get(FutureTask.java:192)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:140)
           ... 26 more
   Caused by: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.getHoodieScheme(HoodieWrapperFileSystem.java:109)
           at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.convertToHoodiePath(HoodieWrapperFileSystem.java:85)
           at org.apache.hudi.io.storage.HoodieParquetWriter.<init>(HoodieParquetWriter.java:57)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.newParquetStorageWriter(HoodieStorageWriterFactory.java:60)
           at org.apache.hudi.io.storage.HoodieStorageWriterFactory.getStorageWriter(HoodieStorageWriterFactory.java:44)
           at org.apache.hudi.io.HoodieCreateHandle.<init>(HoodieCreateHandle.java:70)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:137)
           at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:125)
           at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:38)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:120)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           ... 3 more
   
   Driver stacktrace:
     at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
     at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
     at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
     at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
     at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
     at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
     at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
     at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
     at scala.Option.foreach(Option.scala:257)
     at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
     at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
     at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
     at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
     at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
     at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
     at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
     at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
     at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
     at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
     at org.apache.spark.rdd.RDD.count(RDD.scala:1168)
     at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:145)
     at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
     at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
     at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
     at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
     at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
     at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
     at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
     at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
     at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
     at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
     at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
     at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
     at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
     at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
     at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
     at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
     at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
     at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
     ... 70 elided
   Caused by: java.lang.RuntimeException: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
     at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:122)
     at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
     at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
     at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
     at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
     at org.apache.spark.storage.memory.MemoryStore.putIteratorAsBytes(MemoryStore.scala:349)
     at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
     at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
     at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
     at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
     at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
     at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
     at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
     at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
     at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
     at org.apache.spark.scheduler.Task.run(Task.scala:123)
     at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
     at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
     at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
     at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:103)
     at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:42)
     at org.apache.hudi.func.LazyIterableIterator.next(LazyIterableIterator.java:120)
     ... 23 more
   Caused by: org.apache.hudi.exception.HoodieException: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
     at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:142)
     at org.apache.hudi.func.CopyOnWriteLazyInsertIterable.computeNext(CopyOnWriteLazyInsertIterable.java:99)
     ... 25 more
   Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
     at java.util.concurrent.FutureTask.report(FutureTask.java:122)
     at java.util.concurrent.FutureTask.get(FutureTask.java:192)
     at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:140)
     ... 26 more
   Caused by: java.lang.IllegalArgumentException: BlockAlignedAvroParquetWriter does not support scheme s3n
     at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.getHoodieScheme(HoodieWrapperFileSystem.java:109)
     at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.convertToHoodiePath(HoodieWrapperFileSystem.java:85)
     at org.apache.hudi.io.storage.HoodieParquetWriter.<init>(HoodieParquetWriter.java:57)
     at org.apache.hudi.io.storage.HoodieStorageWriterFactory.newParquetStorageWriter(HoodieStorageWriterFactory.java:60)
     at org.apache.hudi.io.storage.HoodieStorageWriterFactory.getStorageWriter(HoodieStorageWriterFactory.java:44)
     at org.apache.hudi.io.HoodieCreateHandle.<init>(HoodieCreateHandle.java:70)
     at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:137)
     at org.apache.hudi.func.CopyOnWriteLazyInsertIterable$CopyOnWriteInsertHandler.consumeOneRecord(CopyOnWriteLazyInsertIterable.java:125)
     at org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:38)
     at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$2(BoundedInMemoryExecutor.java:120)
     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
     ... 3 more
   ```
   
   

[GitHub] [incubator-hudi] goldfix commented on issue #1266: BlockAlignedAvroParquetWriter does not support scheme s3n

Posted by GitBox <gi...@apache.org>.
goldfix commented on issue #1266: BlockAlignedAvroParquetWriter does not support scheme s3n
URL: https://github.com/apache/incubator-hudi/issues/1266#issuecomment-577308005
 
 
   Hi @vinothchandar and thanks.
   
   I corrected my spark-shell command, and that solved the issue:
   
   ```
   spark-shell \
     --packages org.apache.hadoop:hadoop-aws:2.7.7,org.apache.hudi:hudi-spark-bundle:0.5.0-incubating \
     --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" \
     --conf "spark.sql.hive.convertMetastoreParquet=false" \
     --conf "spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \
     --conf "spark.hadoop.fs.s3a.secret.key=XYZ" \
     --conf "spark.hadoop.fs.s3a.access.key=ABC" \
     --conf "spark.hadoop.fs.defaultFS=s3a://my-s3-buckets"
   ```
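   
   For completeness, a minimal sketch of the corresponding write, assuming the table base path is also switched to the `s3a` scheme (same quickstart code as in the original report, only the path prefix changes; `my-s3-buckets` is the placeholder bucket from the command above):
   
   ```
   import org.apache.hudi.QuickstartUtils._
   import scala.collection.JavaConversions._
   import org.apache.spark.sql.SaveMode._
   import org.apache.hudi.DataSourceWriteOptions._
   import org.apache.hudi.config.HoodieWriteConfig._
   
   val tableName = "hudi_cow_table"
   // The base path now uses the whitelisted s3a scheme instead of s3/s3n.
   val basePath = "s3a://my-s3-buckets/hudi_cow_table"
   val dataGen = new DataGenerator
   
   val inserts = convertToStringList(dataGen.generateInserts(10))
   val df = spark.read.json(spark.sparkContext.parallelize(inserts, 2))
   
   df.write.format("org.apache.hudi").
       options(getQuickstartWriteConfigs).
       option(PRECOMBINE_FIELD_OPT_KEY, "ts").
       option(RECORDKEY_FIELD_OPT_KEY, "uuid").
       option(PARTITIONPATH_FIELD_OPT_KEY, "partitionpath").
       option(TABLE_NAME, tableName).
       mode(Overwrite).
       save(basePath)
   ```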
   
   Thanks
   
   ciao
   p
   

[GitHub] [incubator-hudi] vinothchandar commented on issue #1266: BlockAlignedAvroParquetWriter does not support scheme s3n

Posted by GitBox <gi...@apache.org>.
vinothchandar commented on issue #1266: BlockAlignedAvroParquetWriter does not support scheme s3n
URL: https://github.com/apache/incubator-hudi/issues/1266#issuecomment-576808426
 
 
   This might be because we only whitelist `s3` and `s3a`. Are you able to use one of those? I think s3a is more performant anyway.
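   
   To illustrate the kind of check the stack trace points at, here is a hypothetical sketch of a scheme whitelist, not the actual `HoodieWrapperFileSystem` source (the exact set of supported schemes in Hudi may differ):
   
   ```
   // Hypothetical sketch for illustration only -- names and the exact whitelist are assumptions.
   val supportedSchemes = Set("file", "hdfs", "s3", "s3a")
   
   def validateScheme(scheme: String): Unit = {
     if (!supportedSchemes.contains(scheme)) {
       throw new IllegalArgumentException(
         s"BlockAlignedAvroParquetWriter does not support scheme $scheme")
     }
   }
   
   validateScheme("s3a") // accepted
   validateScheme("s3n") // throws IllegalArgumentException, matching the error in the report
   ```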

[GitHub] [incubator-hudi] goldfix closed issue #1266: BlockAlignedAvroParquetWriter does not support scheme s3n

Posted by GitBox <gi...@apache.org>.
goldfix closed issue #1266: BlockAlignedAvroParquetWriter does not support scheme s3n
URL: https://github.com/apache/incubator-hudi/issues/1266
 
 
   
