Posted to commits@hudi.apache.org by "stantaov (via GitHub)" <gi...@apache.org> on 2023/02/08 17:36:29 UTC

[GitHub] [hudi] stantaov opened a new issue, #7899: [SUPPORT] Error "Could not create interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?" While Deleting Data

stantaov opened a new issue, #7899:
URL: https://github.com/apache/hudi/issues/7899

   **Describe the problem you faced**
   
   Getting the following exception when trying to delete data from a Hudi table.
   
   Caused by: java.lang.RuntimeException: Could not create  interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Submit a Spark job that runs spark.sql("DELETE from XXXX WHERE XXXX") (an equivalent explicit delete write is sketched below).
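   
   For reference, the stack trace shows this SQL statement routed through DeleteHoodieTableCommand into a Hudi datasource write running the delete operation (DeleteHoodieTableCommand -> DataFrameWriter.save -> HoodieSparkSqlWriter -> doDeleteOperation). A hedged sketch of that equivalent explicit write; the base path, predicate, and table name are placeholders, not values from this report, and the key/precombine options are omitted for brevity:
   
   ```
   // Sketch only: what the DELETE statement does under the hood, per the stack trace.
   spark.sql("SELECT * FROM XXXX WHERE XXXX")                // rows to delete (placeholder predicate)
     .write.format("hudi")
     .option("hoodie.datasource.write.operation", "delete")  // standard Hudi datasource option
     .option("hoodie.table.name", "XXXX")
     .mode("append")
     .save("/hypothetical/base/path")                        // placeholder base path
   ```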
   
   **Expected behavior**
   
   We expect the specified rows to be deleted from the table.
   
   **Environment Description**
   
   * Hudi version : 0.12.2
   
   * Spark version : 3.2.1
   
   * Hive version : 3.1.3000.7.1.7.0-551
   
   * Hadoop version : 3.1.1.7.1.7.0-551
   
   * Storage (HDFS/S3/GCS..) : HDFS
   
   * Running on Docker? (yes/no) : no
   
   
   **Additional context**
   
   Jars used:
   
   ```
   "org.apache.spark" %% "spark-sql" % "3.2.1",
   "org.apache.hudi" %% "hudi-spark3.2-bundle" % "0.12.2"
   
   ```
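   
   For completeness, a self-contained build.sbt sketch around those two dependencies. The Scala version and the "provided" scope are assumptions not stated in the report (Spark 3.2.1 ships Scala 2.12 builds by default, and spark-sql is normally provided by the cluster at runtime):
   
   ```
   // Hypothetical build.sbt; only the two libraryDependencies entries come from the report.
   name := "hudi-delete-job"                  // placeholder project name
   scalaVersion := "2.12.15"                  // assumption: Spark 3.2.1 default Scala line
   
   libraryDependencies ++= Seq(
     "org.apache.spark" %% "spark-sql" % "3.2.1" % "provided",  // assumption: supplied by the cluster
     "org.apache.hudi" %% "hudi-spark3.2-bundle" % "0.12.2"     // shipped in the application jar
   )
   ```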
   Spark submit command:
   
   ```
   spark3-submit --master yarn --deploy-mode cluster --queue XXXX --executor-memory 18G --driver-memory 16G --num-executors 4 --executor-cores 2 --conf spark.yarn.max.executor.failures=10 --conf spark.sql.shuffle.partitions=100 --conf spark.rdd.compress=true --name XXXXX --conf spark.scheduler.mode=FAIR --conf spark.executor.memoryOverhead=4096 --conf spark.driver.memoryOverhead=2048 --class XXXXX /tmp/XXXXX.jar --jars /tmp/XXXXXX.jar  -DtableName="XXXX" -DtableRange="XXXX"
   ```
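   
   One detail in the command above: spark-submit treats everything after the application jar (/tmp/XXXXX.jar here) as arguments to the main class, so the trailing --jars option is never seen by spark-submit itself. A reordered sketch with the same placeholders:
   
   ```
   spark3-submit --master yarn --deploy-mode cluster --queue XXXX \
     --executor-memory 18G --driver-memory 16G --num-executors 4 --executor-cores 2 \
     --conf spark.yarn.max.executor.failures=10 \
     --conf spark.sql.shuffle.partitions=100 \
     --conf spark.rdd.compress=true \
     --name XXXXX \
     --conf spark.scheduler.mode=FAIR \
     --conf spark.executor.memoryOverhead=4096 \
     --conf spark.driver.memoryOverhead=2048 \
     --jars /tmp/XXXXXX.jar \
     --class XXXXX \
     /tmp/XXXXX.jar -DtableName="XXXX" -DtableRange="XXXX"
   ```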
   Spark session:
   
   ```
   trait SparkSessionWrapper {
     lazy val spark: SparkSession = SparkSession
       .builder
       .appName("XXXXXX")
       // Kryo serialization, as the Hudi docs recommend
       .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
       // enables Hudi SQL statements such as DELETE FROM
       .config("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
       // Hudi catalog integration for Spark 3.2+
       .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.hudi.catalog.HoodieCatalog")
       .config("spark.sql.hive.convertMetastoreParquet", "false")
       .enableHiveSupport()
       .getOrCreate()
   }
   ```
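   
   The stack trace below points at com.telus.argus.streaming.StreamingJob, a scala.App that issues the DELETE at StreamingJob.scala:8. A sketch of what that entry point presumably looks like; only the object name, the App inheritance, and the spark.sql call are confirmed by the trace frames, the rest (including mixing in SparkSessionWrapper) is an assumption:
   
   ```
   package com.telus.argus.streaming
   
   // Reconstructed from the stack trace frames; the body is illustrative only.
   object StreamingJob extends App with SparkSessionWrapper {
     spark.sql("DELETE from XXXX WHERE XXXX")   // the call at StreamingJob.scala:8 in the trace
     spark.stop()
   }
   ```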
   
   
   **Stacktrace**
   
   ```
   23/02/08 17:18:35 INFO yarn.Client: 
            client token: Token { kind: YARN_CLIENT_TOKEN, service:  }
            diagnostics: User class threw exception: org.apache.hudi.exception.HoodieUpsertException: Failed to delete for commit time 20230208171759453
           at org.apache.hudi.table.action.commit.HoodieDeleteHelper.execute(HoodieDeleteHelper.java:113)
           at org.apache.hudi.table.action.commit.SparkDeleteCommitActionExecutor.execute(SparkDeleteCommitActionExecutor.java:45)
           at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.delete(HoodieSparkCopyOnWriteTable.java:130)
           at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.delete(HoodieSparkCopyOnWriteTable.java:97)
           at org.apache.hudi.client.SparkRDDWriteClient.delete(SparkRDDWriteClient.java:262)
           at org.apache.hudi.DataSourceUtils.doDeleteOperation(DataSourceUtils.java:218)
           at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:213)
           at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:144)
           at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:111)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
           at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:111)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106)
           at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
           at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
           at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106)
           at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93)
           at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91)
           at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:129)
           at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:856)
           at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:382)
           at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:355)
           at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:247)
           at org.apache.spark.sql.hudi.command.DeleteHoodieTableCommand.run(DeleteHoodieTableCommand.scala:48)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:111)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
           at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:111)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106)
           at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
           at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
           at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106)
           at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93)
           at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91)
           at org.apache.spark.sql.Dataset.<init>(Dataset.scala:219)
           at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
           at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
           at com.telus.argus.streaming.StreamingJob$.delayedEndpoint$com$telus$argus$streaming$StreamingJob$1(StreamingJob.scala:8)
           at com.telus.argus.streaming.StreamingJob$delayedInit$body.apply(StreamingJob.scala:6)
           at scala.Function0.apply$mcV$sp(Function0.scala:39)
           at scala.Function0.apply$mcV$sp$(Function0.scala:39)
           at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
           at scala.App.$anonfun$main$1$adapted(App.scala:80)
           at scala.collection.immutable.List.foreach(List.scala:392)
           at scala.App.main(App.scala:80)
           at scala.App.main$(App.scala:78)
           at com.telus.argus.streaming.StreamingJob$.main(StreamingJob.scala:6)
           at com.telus.argus.streaming.StreamingJob.main(StreamingJob.scala)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:737)
   Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 10.0 failed 4 times, most recent failure: Lost task 0.3 in stage 10.0 (TID 428) (tslp000992.oss.ads executor 7): java.lang.ExceptionInInitializerError
           at org.apache.hudi.io.storage.HoodieHFileUtils.createHFileReader(HoodieHFileUtils.java:56)
           at org.apache.hudi.io.storage.HoodieHFileReader.<init>(HoodieHFileReader.java:95)
           at org.apache.hudi.io.storage.HoodieFileReaderFactory.newHFileFileReader(HoodieFileReaderFactory.java:57)
           at org.apache.hudi.io.storage.HoodieFileReaderFactory.getFileReader(HoodieFileReaderFactory.java:42)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getBaseFileReader(HoodieBackedTableMetadata.java:439)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.openReaders(HoodieBackedTableMetadata.java:413)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getOrCreateReaders(HoodieBackedTableMetadata.java:405)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeys$1(HoodieBackedTableMetadata.java:212)
           at java.util.HashMap.forEach(HashMap.java:1289)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordsByKeys(HoodieBackedTableMetadata.java:210)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordByKey(HoodieBackedTableMetadata.java:142)
           at org.apache.hudi.metadata.BaseTableMetadata.fetchAllFilesInPartition(BaseTableMetadata.java:323)
           at org.apache.hudi.metadata.BaseTableMetadata.getAllFilesInPartition(BaseTableMetadata.java:141)
           at org.apache.hudi.metadata.HoodieMetadataFileSystemView.listPartition(HoodieMetadataFileSystemView.java:65)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.lambda$ensurePartitionLoadedCorrectly$9(AbstractTableFileSystemView.java:306)
           at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.ensurePartitionLoadedCorrectly(AbstractTableFileSystemView.java:297)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.getLatestBaseFilesBeforeOrOn(AbstractTableFileSystemView.java:521)
           at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:103)
           at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getLatestBaseFilesBeforeOrOn(PriorityBasedFileSystemView.java:144)
           at org.apache.hudi.index.HoodieIndexUtils.getLatestBaseFilesForPartition(HoodieIndexUtils.java:70)
           at org.apache.hudi.index.HoodieIndexUtils.lambda$getLatestBaseFilesForAllPartitions$ff6885d8$1(HoodieIndexUtils.java:110)
           at org.apache.hudi.client.common.HoodieSparkEngineContext.lambda$flatMap$7d470b86$1(HoodieSparkEngineContext.java:137)
           at org.apache.spark.api.java.JavaRDDLike.$anonfun$flatMap$1(JavaRDDLike.scala:125)
           at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
           at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
           at scala.collection.Iterator.foreach(Iterator.scala:941)
           at scala.collection.Iterator.foreach$(Iterator.scala:941)
           at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
           at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
           at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
           at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
           at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
           at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
           at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
           at scala.collection.AbstractIterator.to(Iterator.scala:1429)
           at scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
           at scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
           at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1429)
           at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
           at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
           at scala.collection.AbstractIterator.toArray(Iterator.scala:1429)
           at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
           at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2254)
           at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
           at org.apache.spark.scheduler.Task.run(Task.scala:131)
           at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1462)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.RuntimeException: Could not create  interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?
           at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:75)
           at org.apache.hudi.org.apache.hadoop.hbase.io.MetricsIO.<init>(MetricsIO.java:32)
           at org.apache.hudi.org.apache.hadoop.hbase.io.hfile.HFile.<clinit>(HFile.java:176)
           ... 52 more
   Caused by: java.util.NoSuchElementException
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:365)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:61)
           ... 54 more
   
   Driver stacktrace:
           at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2454)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2403)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2402)
           at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
           at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
           at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
           at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2402)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1160)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1160)
           at scala.Option.foreach(Option.scala:407)
           at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1160)
           at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2642)
           at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2584)
           at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2573)
           at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
           at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:938)
           at org.apache.spark.SparkContext.runJob(SparkContext.scala:2214)
           at org.apache.spark.SparkContext.runJob(SparkContext.scala:2235)
           at org.apache.spark.SparkContext.runJob(SparkContext.scala:2254)
           at org.apache.spark.SparkContext.runJob(SparkContext.scala:2279)
           at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
           at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
           at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
           at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
           at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
           at org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:362)
           at org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:361)
           at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
           at org.apache.hudi.client.common.HoodieSparkEngineContext.flatMap(HoodieSparkEngineContext.java:137)
           at org.apache.hudi.index.HoodieIndexUtils.getLatestBaseFilesForAllPartitions(HoodieIndexUtils.java:108)
           at org.apache.hudi.index.simple.HoodieSimpleIndex.fetchRecordLocationsForAffectedPartitions(HoodieSimpleIndex.java:144)
           at org.apache.hudi.index.simple.HoodieSimpleIndex.tagLocationInternal(HoodieSimpleIndex.java:113)
           at org.apache.hudi.index.simple.HoodieSimpleIndex.tagLocation(HoodieSimpleIndex.java:91)
           at org.apache.hudi.table.action.commit.HoodieDeleteHelper.execute(HoodieDeleteHelper.java:92)
           ... 83 more
   Caused by: java.lang.ExceptionInInitializerError
           at org.apache.hudi.io.storage.HoodieHFileUtils.createHFileReader(HoodieHFileUtils.java:56)
           at org.apache.hudi.io.storage.HoodieHFileReader.<init>(HoodieHFileReader.java:95)
           at org.apache.hudi.io.storage.HoodieFileReaderFactory.newHFileFileReader(HoodieFileReaderFactory.java:57)
           at org.apache.hudi.io.storage.HoodieFileReaderFactory.getFileReader(HoodieFileReaderFactory.java:42)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getBaseFileReader(HoodieBackedTableMetadata.java:439)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.openReaders(HoodieBackedTableMetadata.java:413)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getOrCreateReaders(HoodieBackedTableMetadata.java:405)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeys$1(HoodieBackedTableMetadata.java:212)
           at java.util.HashMap.forEach(HashMap.java:1289)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordsByKeys(HoodieBackedTableMetadata.java:210)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordByKey(HoodieBackedTableMetadata.java:142)
           at org.apache.hudi.metadata.BaseTableMetadata.fetchAllFilesInPartition(BaseTableMetadata.java:323)
           at org.apache.hudi.metadata.BaseTableMetadata.getAllFilesInPartition(BaseTableMetadata.java:141)
           at org.apache.hudi.metadata.HoodieMetadataFileSystemView.listPartition(HoodieMetadataFileSystemView.java:65)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.lambda$ensurePartitionLoadedCorrectly$9(AbstractTableFileSystemView.java:306)
           at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.ensurePartitionLoadedCorrectly(AbstractTableFileSystemView.java:297)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.getLatestBaseFilesBeforeOrOn(AbstractTableFileSystemView.java:521)
           at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:103)
           at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getLatestBaseFilesBeforeOrOn(PriorityBasedFileSystemView.java:144)
           at org.apache.hudi.index.HoodieIndexUtils.getLatestBaseFilesForPartition(HoodieIndexUtils.java:70)
           at org.apache.hudi.index.HoodieIndexUtils.lambda$getLatestBaseFilesForAllPartitions$ff6885d8$1(HoodieIndexUtils.java:110)
           at org.apache.hudi.client.common.HoodieSparkEngineContext.lambda$flatMap$7d470b86$1(HoodieSparkEngineContext.java:137)
           at org.apache.spark.api.java.JavaRDDLike.$anonfun$flatMap$1(JavaRDDLike.scala:125)
           at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
           at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
           at scala.collection.Iterator.foreach(Iterator.scala:941)
           at scala.collection.Iterator.foreach$(Iterator.scala:941)
           at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
           at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
           at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
           at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
           at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
           at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
           at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
           at scala.collection.AbstractIterator.to(Iterator.scala:1429)
           at scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
           at scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
           at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1429)
           at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
           at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
           at scala.collection.AbstractIterator.toArray(Iterator.scala:1429)
           at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
           at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2254)
           at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
           at org.apache.spark.scheduler.Task.run(Task.scala:131)
           at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1462)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.RuntimeException: Could not create  interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?
           at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:75)
           at org.apache.hudi.org.apache.hadoop.hbase.io.MetricsIO.<init>(MetricsIO.java:32)
           at org.apache.hudi.org.apache.hadoop.hbase.io.hfile.HFile.<clinit>(HFile.java:176)
           ... 52 more
   Caused by: java.util.NoSuchElementException
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:365)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:61)
           ... 54 more
   
            ApplicationMaster host: tslp001441.oss.ads
            ApplicationMaster RPC port: 33760
            queue: pentaho
            start time: 1675876545632
            final status: FAILED
            tracking URL: https://tslp000994.oss.ads:8090/proxy/application_1675116425226_33144/
            user: pentaho
   23/02/08 17:18:35 ERROR yarn.Client: Application diagnostics message: User class threw exception: org.apache.hudi.exception.HoodieUpsertException: Failed to delete for commit time 20230208171759453
           at org.apache.hudi.table.action.commit.HoodieDeleteHelper.execute(HoodieDeleteHelper.java:113)
           at org.apache.hudi.table.action.commit.SparkDeleteCommitActionExecutor.execute(SparkDeleteCommitActionExecutor.java:45)
           at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.delete(HoodieSparkCopyOnWriteTable.java:130)
           at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.delete(HoodieSparkCopyOnWriteTable.java:97)
           at org.apache.hudi.client.SparkRDDWriteClient.delete(SparkRDDWriteClient.java:262)
           at org.apache.hudi.DataSourceUtils.doDeleteOperation(DataSourceUtils.java:218)
           at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:213)
           at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:144)
           at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:111)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
           at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:111)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106)
           at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
           at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
           at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106)
           at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93)
           at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91)
           at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:129)
           at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:856)
           at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:382)
           at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:355)
           at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:247)
           at org.apache.spark.sql.hudi.command.DeleteHoodieTableCommand.run(DeleteHoodieTableCommand.scala:48)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:111)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
           at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:111)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106)
           at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
           at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
           at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106)
           at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93)
           at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91)
           at org.apache.spark.sql.Dataset.<init>(Dataset.scala:219)
           at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
           at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
           at com.telus.argus.streaming.StreamingJob$.delayedEndpoint$com$telus$argus$streaming$StreamingJob$1(StreamingJob.scala:8)
           at com.telus.argus.streaming.StreamingJob$delayedInit$body.apply(StreamingJob.scala:6)
           at scala.Function0.apply$mcV$sp(Function0.scala:39)
           at scala.Function0.apply$mcV$sp$(Function0.scala:39)
           at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
           at scala.App.$anonfun$main$1$adapted(App.scala:80)
           at scala.collection.immutable.List.foreach(List.scala:392)
           at scala.App.main(App.scala:80)
           at scala.App.main$(App.scala:78)
           at com.telus.argus.streaming.StreamingJob$.main(StreamingJob.scala:6)
           at com.telus.argus.streaming.StreamingJob.main(StreamingJob.scala)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:737)
   Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 10.0 failed 4 times, most recent failure: Lost task 0.3 in stage 10.0 (TID 428) (tslp000992.oss.ads executor 7): java.lang.ExceptionInInitializerError
           at org.apache.hudi.io.storage.HoodieHFileUtils.createHFileReader(HoodieHFileUtils.java:56)
           at org.apache.hudi.io.storage.HoodieHFileReader.<init>(HoodieHFileReader.java:95)
           at org.apache.hudi.io.storage.HoodieFileReaderFactory.newHFileFileReader(HoodieFileReaderFactory.java:57)
           at org.apache.hudi.io.storage.HoodieFileReaderFactory.getFileReader(HoodieFileReaderFactory.java:42)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getBaseFileReader(HoodieBackedTableMetadata.java:439)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.openReaders(HoodieBackedTableMetadata.java:413)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getOrCreateReaders(HoodieBackedTableMetadata.java:405)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeys$1(HoodieBackedTableMetadata.java:212)
           at java.util.HashMap.forEach(HashMap.java:1289)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordsByKeys(HoodieBackedTableMetadata.java:210)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordByKey(HoodieBackedTableMetadata.java:142)
           at org.apache.hudi.metadata.BaseTableMetadata.fetchAllFilesInPartition(BaseTableMetadata.java:323)
           at org.apache.hudi.metadata.BaseTableMetadata.getAllFilesInPartition(BaseTableMetadata.java:141)
           at org.apache.hudi.metadata.HoodieMetadataFileSystemView.listPartition(HoodieMetadataFileSystemView.java:65)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.lambda$ensurePartitionLoadedCorrectly$9(AbstractTableFileSystemView.java:306)
           at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.ensurePartitionLoadedCorrectly(AbstractTableFileSystemView.java:297)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.getLatestBaseFilesBeforeOrOn(AbstractTableFileSystemView.java:521)
           at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:103)
           at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getLatestBaseFilesBeforeOrOn(PriorityBasedFileSystemView.java:144)
           at org.apache.hudi.index.HoodieIndexUtils.getLatestBaseFilesForPartition(HoodieIndexUtils.java:70)
           at org.apache.hudi.index.HoodieIndexUtils.lambda$getLatestBaseFilesForAllPartitions$ff6885d8$1(HoodieIndexUtils.java:110)
           at org.apache.hudi.client.common.HoodieSparkEngineContext.lambda$flatMap$7d470b86$1(HoodieSparkEngineContext.java:137)
           at org.apache.spark.api.java.JavaRDDLike.$anonfun$flatMap$1(JavaRDDLike.scala:125)
           at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
           at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
           at scala.collection.Iterator.foreach(Iterator.scala:941)
           at scala.collection.Iterator.foreach$(Iterator.scala:941)
           at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
           at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
           at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
           at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
           at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
           at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
           at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
           at scala.collection.AbstractIterator.to(Iterator.scala:1429)
           at scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
           at scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
           at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1429)
           at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
           at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
           at scala.collection.AbstractIterator.toArray(Iterator.scala:1429)
           at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
           at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2254)
           at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
           at org.apache.spark.scheduler.Task.run(Task.scala:131)
           at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1462)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.RuntimeException: Could not create  interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?
           at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:75)
           at org.apache.hudi.org.apache.hadoop.hbase.io.MetricsIO.<init>(MetricsIO.java:32)
           at org.apache.hudi.org.apache.hadoop.hbase.io.hfile.HFile.<clinit>(HFile.java:176)
           ... 52 more
   Caused by: java.util.NoSuchElementException
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:365)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:61)
           ... 54 more
   
   Driver stacktrace:
           at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2454)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2403)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2402)
           at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
           at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
           at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
           at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2402)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1160)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1160)
           at scala.Option.foreach(Option.scala:407)
           at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1160)
           at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2642)
           at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2584)
           at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2573)
           at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
           at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:938)
           at org.apache.spark.SparkContext.runJob(SparkContext.scala:2214)
           at org.apache.spark.SparkContext.runJob(SparkContext.scala:2235)
           at org.apache.spark.SparkContext.runJob(SparkContext.scala:2254)
           at org.apache.spark.SparkContext.runJob(SparkContext.scala:2279)
           at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
           at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
           at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
           at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
           at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
           at org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:362)
           at org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:361)
           at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
           at org.apache.hudi.client.common.HoodieSparkEngineContext.flatMap(HoodieSparkEngineContext.java:137)
           at org.apache.hudi.index.HoodieIndexUtils.getLatestBaseFilesForAllPartitions(HoodieIndexUtils.java:108)
           at org.apache.hudi.index.simple.HoodieSimpleIndex.fetchRecordLocationsForAffectedPartitions(HoodieSimpleIndex.java:144)
           at org.apache.hudi.index.simple.HoodieSimpleIndex.tagLocationInternal(HoodieSimpleIndex.java:113)
           at org.apache.hudi.index.simple.HoodieSimpleIndex.tagLocation(HoodieSimpleIndex.java:91)
           at org.apache.hudi.table.action.commit.HoodieDeleteHelper.execute(HoodieDeleteHelper.java:92)
           ... 83 more
   Caused by: java.lang.ExceptionInInitializerError
           at org.apache.hudi.io.storage.HoodieHFileUtils.createHFileReader(HoodieHFileUtils.java:56)
           at org.apache.hudi.io.storage.HoodieHFileReader.<init>(HoodieHFileReader.java:95)
           at org.apache.hudi.io.storage.HoodieFileReaderFactory.newHFileFileReader(HoodieFileReaderFactory.java:57)
           at org.apache.hudi.io.storage.HoodieFileReaderFactory.getFileReader(HoodieFileReaderFactory.java:42)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getBaseFileReader(HoodieBackedTableMetadata.java:439)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.openReaders(HoodieBackedTableMetadata.java:413)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getOrCreateReaders(HoodieBackedTableMetadata.java:405)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeys$1(HoodieBackedTableMetadata.java:212)
           at java.util.HashMap.forEach(HashMap.java:1289)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordsByKeys(HoodieBackedTableMetadata.java:210)
           at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordByKey(HoodieBackedTableMetadata.java:142)
           at org.apache.hudi.metadata.BaseTableMetadata.fetchAllFilesInPartition(BaseTableMetadata.java:323)
           at org.apache.hudi.metadata.BaseTableMetadata.getAllFilesInPartition(BaseTableMetadata.java:141)
           at org.apache.hudi.metadata.HoodieMetadataFileSystemView.listPartition(HoodieMetadataFileSystemView.java:65)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.lambda$ensurePartitionLoadedCorrectly$9(AbstractTableFileSystemView.java:306)
           at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.ensurePartitionLoadedCorrectly(AbstractTableFileSystemView.java:297)
           at org.apache.hudi.common.table.view.AbstractTableFileSystemView.getLatestBaseFilesBeforeOrOn(AbstractTableFileSystemView.java:521)
           at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:103)
           at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getLatestBaseFilesBeforeOrOn(PriorityBasedFileSystemView.java:144)
           at org.apache.hudi.index.HoodieIndexUtils.getLatestBaseFilesForPartition(HoodieIndexUtils.java:70)
           at org.apache.hudi.index.HoodieIndexUtils.lambda$getLatestBaseFilesForAllPartitions$ff6885d8$1(HoodieIndexUtils.java:110)
           at org.apache.hudi.client.common.HoodieSparkEngineContext.lambda$flatMap$7d470b86$1(HoodieSparkEngineContext.java:137)
           at org.apache.spark.api.java.JavaRDDLike.$anonfun$flatMap$1(JavaRDDLike.scala:125)
           at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
           at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
           at scala.collection.Iterator.foreach(Iterator.scala:941)
           at scala.collection.Iterator.foreach$(Iterator.scala:941)
           at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
           at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
           at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
           at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
           at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
           at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
           at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
           at scala.collection.AbstractIterator.to(Iterator.scala:1429)
           at scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
           at scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
           at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1429)
           at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
           at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
           at scala.collection.AbstractIterator.toArray(Iterator.scala:1429)
           at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
           at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2254)
           at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
           at org.apache.spark.scheduler.Task.run(Task.scala:131)
           at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1462)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.RuntimeException: Could not create  interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?
           at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:75)
           at org.apache.hudi.org.apache.hadoop.hbase.io.MetricsIO.<init>(MetricsIO.java:32)
           at org.apache.hudi.org.apache.hadoop.hbase.io.hfile.HFile.<clinit>(HFile.java:176)
           ... 52 more
   Caused by: java.util.NoSuchElementException
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:365)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:61)
           ... 54 more
   
   Exception in thread "main" org.apache.spark.SparkException: Application application_1675116425226_33144 finished with failed status
           at org.apache.spark.deploy.yarn.Client.run(Client.scala:1336)
           at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1733)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1025)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:181)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:204)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1113)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1122)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   23/02/08 17:18:35 INFO util.ShutdownHookManager: Shutdown hook called
   ```
   
   ```
   java.lang.ExceptionInInitializerError
   	at org.apache.hudi.io.storage.HoodieHFileUtils.createHFileReader(HoodieHFileUtils.java:56)
   	at org.apache.hudi.io.storage.HoodieHFileReader.<init>(HoodieHFileReader.java:95)
   	at org.apache.hudi.io.storage.HoodieFileReaderFactory.newHFileFileReader(HoodieFileReaderFactory.java:57)
   	at org.apache.hudi.io.storage.HoodieFileReaderFactory.getFileReader(HoodieFileReaderFactory.java:42)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.getBaseFileReader(HoodieBackedTableMetadata.java:439)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.openReaders(HoodieBackedTableMetadata.java:413)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getOrCreateReaders$11(HoodieBackedTableMetadata.java:403)
   	at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.getOrCreateReaders(HoodieBackedTableMetadata.java:402)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeys$1(HoodieBackedTableMetadata.java:212)
   	at java.util.HashMap.forEach(HashMap.java:1289)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordsByKeys(HoodieBackedTableMetadata.java:210)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordByKey(HoodieBackedTableMetadata.java:142)
   	at org.apache.hudi.metadata.BaseTableMetadata.fetchAllFilesInPartition(BaseTableMetadata.java:323)
   	at org.apache.hudi.metadata.BaseTableMetadata.getAllFilesInPartition(BaseTableMetadata.java:141)
   	at org.apache.hudi.metadata.HoodieMetadataFileSystemView.listPartition(HoodieMetadataFileSystemView.java:65)
   	at org.apache.hudi.common.table.view.AbstractTableFileSystemView.lambda$ensurePartitionLoadedCorrectly$9(AbstractTableFileSystemView.java:306)
   	at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
   	at org.apache.hudi.common.table.view.AbstractTableFileSystemView.ensurePartitionLoadedCorrectly(AbstractTableFileSystemView.java:297)
   	at org.apache.hudi.common.table.view.AbstractTableFileSystemView.getLatestBaseFilesBeforeOrOn(AbstractTableFileSystemView.java:521)
   	at org.apache.hudi.timeline.service.handlers.BaseFileHandler.getLatestDataFilesBeforeOrOn(BaseFileHandler.java:60)
   	at org.apache.hudi.timeline.service.RequestHandler.lambda$registerDataFilesAPI$6(RequestHandler.java:271)
   	at org.apache.hudi.timeline.service.RequestHandler$ViewHandler.handle(RequestHandler.java:500)
   	at io.javalin.security.SecurityUtil.noopAccessManager(SecurityUtil.kt:22)
   	at io.javalin.Javalin.lambda$addHandler$0(Javalin.java:606)
   	at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:46)
   	at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:17)
   	at io.javalin.core.JavalinServlet$service$1.invoke(JavalinServlet.kt:143)
   	at io.javalin.core.JavalinServlet$service$2.invoke(JavalinServlet.kt:41)
   	at io.javalin.core.JavalinServlet.service(JavalinServlet.kt:107)
   	at io.javalin.core.util.JettyServerUtil$initialize$httpHandler$1.doHandle(JettyServerUtil.kt:72)
   	at org.apache.hudi.org.apache.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
   	at org.apache.hudi.org.apache.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
   	at org.apache.hudi.org.apache.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1668)
   	at org.apache.hudi.org.apache.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
   	at org.apache.hudi.org.apache.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
   	at org.apache.hudi.org.apache.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
   	at org.apache.hudi.org.apache.jetty.server.handler.HandlerList.handle(HandlerList.java:61)
   	at org.apache.hudi.org.apache.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:174)
   	at org.apache.hudi.org.apache.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
   	at org.apache.hudi.org.apache.jetty.server.Server.handle(Server.java:502)
   	at org.apache.hudi.org.apache.jetty.server.HttpChannel.handle(HttpChannel.java:370)
   	at org.apache.hudi.org.apache.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
   	at org.apache.hudi.org.apache.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
   	at org.apache.hudi.org.apache.jetty.io.FillInterest.fillable(FillInterest.java:103)
   	at org.apache.hudi.org.apache.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
   	at org.apache.hudi.org.apache.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
   	at org.apache.hudi.org.apache.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
   	at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.RuntimeException: Could not create  interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?
   	at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:75)
   	at org.apache.hudi.org.apache.hadoop.hbase.io.MetricsIO.<init>(MetricsIO.java:32)
   	at org.apache.hudi.org.apache.hadoop.hbase.io.hfile.HFile.<clinit>(HFile.java:176)
   	... 49 more
   Caused by: java.util.NoSuchElementException
   	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:365)
   	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
   	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
   	at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:61)
   	... 51 more
   23/02/08 16:13:29 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 10.0 (TID 425) (tslp001449.oss.ads executor 4): java.lang.ExceptionInInitializerError
   	at org.apache.hudi.io.storage.HoodieHFileUtils.createHFileReader(HoodieHFileUtils.java:56)
   	at org.apache.hudi.io.storage.HoodieHFileReader.<init>(HoodieHFileReader.java:95)
   	at org.apache.hudi.io.storage.HoodieFileReaderFactory.newHFileFileReader(HoodieFileReaderFactory.java:57)
   	at org.apache.hudi.io.storage.HoodieFileReaderFactory.getFileReader(HoodieFileReaderFactory.java:42)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.getBaseFileReader(HoodieBackedTableMetadata.java:439)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.openReaders(HoodieBackedTableMetadata.java:413)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.getOrCreateReaders(HoodieBackedTableMetadata.java:405)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeys$1(HoodieBackedTableMetadata.java:212)
   	at java.util.HashMap.forEach(HashMap.java:1289)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordsByKeys(HoodieBackedTableMetadata.java:210)
   	at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordByKey(HoodieBackedTableMetadata.java:142)
   	at org.apache.hudi.metadata.BaseTableMetadata.fetchAllFilesInPartition(BaseTableMetadata.java:323)
   	at org.apache.hudi.metadata.BaseTableMetadata.getAllFilesInPartition(BaseTableMetadata.java:141)
   	at org.apache.hudi.metadata.HoodieMetadataFileSystemView.listPartition(HoodieMetadataFileSystemView.java:65)
   	at org.apache.hudi.common.table.view.AbstractTableFileSystemView.lambda$ensurePartitionLoadedCorrectly$9(AbstractTableFileSystemView.java:306)
   	at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
   	at org.apache.hudi.common.table.view.AbstractTableFileSystemView.ensurePartitionLoadedCorrectly(AbstractTableFileSystemView.java:297)
   	at org.apache.hudi.common.table.view.AbstractTableFileSystemView.getLatestBaseFilesBeforeOrOn(AbstractTableFileSystemView.java:521)
   	at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.execute(PriorityBasedFileSystemView.java:103)
   	at org.apache.hudi.common.table.view.PriorityBasedFileSystemView.getLatestBaseFilesBeforeOrOn(PriorityBasedFileSystemView.java:144)
   	at org.apache.hudi.index.HoodieIndexUtils.getLatestBaseFilesForPartition(HoodieIndexUtils.java:70)
   	at org.apache.hudi.index.HoodieIndexUtils.lambda$getLatestBaseFilesForAllPartitions$ff6885d8$1(HoodieIndexUtils.java:110)
   	at org.apache.hudi.client.common.HoodieSparkEngineContext.lambda$flatMap$7d470b86$1(HoodieSparkEngineContext.java:137)
   	at org.apache.spark.api.java.JavaRDDLike.$anonfun$flatMap$1(JavaRDDLike.scala:125)
   	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
   	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
   	at scala.collection.Iterator.foreach(Iterator.scala:941)
   	at scala.collection.Iterator.foreach$(Iterator.scala:941)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
   	at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
   	at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
   	at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
   	at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
   	at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
   	at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
   	at scala.collection.AbstractIterator.to(Iterator.scala:1429)
   	at scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
   	at scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
   	at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1429)
   	at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
   	at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
   	at scala.collection.AbstractIterator.toArray(Iterator.scala:1429)
   	at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
   	at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2254)
   	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
   	at org.apache.spark.scheduler.Task.run(Task.scala:131)
   	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
   	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1462)
   	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.RuntimeException: Could not create  interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?
   	at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:75)
   	at org.apache.hudi.org.apache.hadoop.hbase.io.MetricsIO.<init>(MetricsIO.java:32)
   	at org.apache.hudi.org.apache.hadoop.hbase.io.hfile.HFile.<clinit>(HFile.java:176)
   	... 52 more
   Caused by: java.util.NoSuchElementException
   	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:365)
   	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
   	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
   	at org.apache.hudi.org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:61)
   	... 54 more
   
   ```
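   
   Reading the trace: the DELETE triggers a metadata-table lookup, Hudi opens the metadata partition's HFile base file through its shaded HBase reader, and `HFile.<clinit>` fails because the shaded `CompatibilitySingletonFactory` cannot find any `MetricsRegionServerSourceFactory` implementation via `ServiceLoader` (the `NoSuchElementException` at the bottom of both traces). As a diagnostic, not a fix, a minimal sketch is to rerun the delete with the metadata table disabled so the HFile reader path is skipped entirely; `hoodie.metadata.enable` is the standard Hudi switch, and the table name and predicate below are placeholders:
   
   ```
   // Diagnostic sketch: with the metadata table off, the shaded HBase HFile reader
   // is never initialized, so the ServiceLoader failure cannot be hit.
   spark.sql("SET hoodie.metadata.enable=false")
   spark.sql("DELETE FROM my_hudi_table WHERE record_date = '2023-02-08'")
   ```
   
   If the delete succeeds with the metadata table off, the failure is isolated to the shaded hbase-hadoop-compat service registration in the bundle jar rather than to the delete path itself.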
   



[GitHub] [hudi] stantaov commented on issue #7899: [SUPPORT] Error "Could not create interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?" While Deleting Data

Posted by "stantaov (via GitHub)" <gi...@apache.org>.
stantaov commented on issue #7899:
URL: https://github.com/apache/hudi/issues/7899#issuecomment-1438644374

   Any help would be appreciated.



[GitHub] [hudi] stantaov commented on issue #7899: [SUPPORT] Error "Could not create interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?" While Deleting Data

Posted by "stantaov (via GitHub)" <gi...@apache.org>.
stantaov commented on issue #7899:
URL: https://github.com/apache/hudi/issues/7899#issuecomment-1430147860

   Hi there, 
   
   Here is the spark-submit command I use:
   
   ` spark3-submit --master yarn --deploy-mode client --queue XXXX --executor-memory 18G --driver-memory 16G --num-executors 4 --executor-cores 2 --conf spark.yarn.max.executor.failures=10 --conf spark.sql.shuffle.partitions=100 --conf spark.rdd.compress=true --name XXXXXX --conf spark.scheduler.mode=FAIR --conf spark.executor.memoryOverhead=4096 --conf spark.driver.memoryOverhead=2048 --class com.XXXXXX  /tmp/XXXXXXX-0.1.jar --jars /tmp/XXXXXX-0.1.jar  --hoodie-conf hoodie.clustering.async.enabled=true`
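   
   To check whether the shaded hadoop-compat service registration is actually visible on this classpath, a quick probe from a `spark3-shell` started with the same `--jars` mirrors the `ServiceLoader` lookup that fails in the stack trace (a sketch; the interface name is copied from the error message):
   
   ```
   // Probe the shaded ServiceLoader lookup used by CompatibilitySingletonFactory.
   import java.util.ServiceLoader
   val cls = Class.forName(
     "org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory")
   println(s"implementation found: ${ServiceLoader.load(cls).iterator().hasNext}")
   ```
   
   If this prints `false` even with the Hudi bundle on the classpath, the shaded `META-INF/services` entry for the factory is missing or shadowed, which would match the `NoSuchElementException` in the stack trace above. For reference, this is the `java.class.path` of the job: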
   
   
   `java.class.path=/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/conf/:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/accessors-smart-2.3.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/activation-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/aggdesigner-algorithm-6.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/aircompressor-0.21.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/algebra_2.12-2.0.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/animal-sniffer-annotations-1.17.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/antlr4-runtime-4.8.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/antlr-runtime-3.5.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.717
 1000.1-1-1.p0.25570994/lib/spark3/jars/aopalliance-1.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/aopalliance-repackaged-2.6.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/arpack-2.2.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/arpack_combined_all-0.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/arrow-format-2.0.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/arrow-memory-core-2.0.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/arrow-memory-netty-2.0.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/arrow-vector-2.0.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/audience-annotations-0.5.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/ja
 rs/automaton-1.11-8.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/avatica-core-1.16.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/avatica-metrics-1.16.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/avro-1.8.2.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/avro-ipc-1.8.2.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/avro-mapred-1.8.2.7.1.7.1000-141-hadoop2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/aws-java-sdk-bundle-1.11.1026.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/azure-keyvault-core-1.0.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.
 3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/azure-storage-7.0.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/blas-2.2.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/bonecp-0.8.0.RELEASE.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/breeze_2.12-1.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/breeze-macros_2.12-1.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/calcite-core-1.19.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/calcite-linq4j-1.19.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/cats-kernel_2.12-2.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/checker-qual-2.8.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.255709
 94/lib/spark3/jars/chill_2.12-0.10.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/chill-java-0.10.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-cli-1.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-codec-1.15.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-collections-3.2.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-compiler-3.0.16.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-compress-1.21.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-configuration-1.10.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-configurati
 on2-2.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-crypto-1.1.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-dbcp-1.4.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-dbcp2-2.5.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-io-2.6.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-lang-2.6.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-lang3-3.12.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-logging-1.1.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-math3-3.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171
 000.1-1-1.p0.25570994/lib/spark3/jars/commons-net-3.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-pool-1.5.4.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-pool2-2.6.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/commons-text-1.6.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/compress-lzf-1.0.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/core-1.1.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/cron-utils-9.1.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/curator-client-4.3.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/curator-framework-4.3.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/
 curator-recipes-4.3.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/datanucleus-api-jdo-4.2.4.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/datanucleus-core-4.1.17.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/datanucleus-rdbms-4.1.19.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/derby-10.14.2.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/dnsjava-2.1.7.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/ehcache-3.3.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/error_prone_annotations-2.3.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/j
 ars/esri-geometry-api-2.2.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/failureaccess-1.0.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/flatbuffers-java-1.9.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/forbiddenapis-2.7.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/gateway-cloud-bindings-1.3.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/gateway-i18n-1.3.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/gateway-shell-1.3.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/gateway-util-common-1.3.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/gcs-connector-2.1.2.7.1.7.1000-141-shaded.jar:/opt/cloudera/parcels/SPARK3
 -3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/generex-1.0.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/gson-2.2.4.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/guava-28.0-jre.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/guice-4.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/guice-servlet-4.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-annotations-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-auth-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-aws-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1
 -1.p0.25570994/lib/spark3/jars/hadoop-azure-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-azure-datalake-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/HikariCP-2.6.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hk2-api-2.6.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-client-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-cloud-storage-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-common-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-hdfs-client-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-mapreduce-client-comm
 on-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-mapreduce-client-core-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-mapreduce-client-jobclient-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-openstack-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-yarn-api-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-yarn-client-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-yarn-common-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-yarn-registry-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spa
 rk3/jars/hadoop-yarn-server-common-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hadoop-yarn-server-web-proxy-3.1.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-classification-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-common-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-exec-3.1.3000.7.1.7.1000-141-core.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-llap-client-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-llap-common-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-metastore-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1
 .p0.25570994/lib/spark3/jars/hive-serde-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-shims-0.23-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-shims-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-shims-common-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-shims-scheduler-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-standalone-metastore-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-storage-api-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hive-vector-code-gen-3.1.3000.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3
 .2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hk2-locator-2.6.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hk2-utils-2.6.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/hppc-0.7.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/httpclient-4.5.13.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/httpcore-4.4.13.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/istack-commons-runtime-3.0.8.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/ivy-2.5.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/j2objc-annotations-1.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-annotations-2.10.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-core-2.10.5
 .jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-core-asl-1.9.13.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-databind-2.10.5.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-dataformat-cbor-2.10.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-dataformat-yaml-2.10.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-datatype-jsr310-2.11.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-jaxrs-base-2.10.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-jaxrs-json-provider-2.10.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-mapper-asl-1.9.13-cloudera.4.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/sp
 ark3/jars/jackson-module-jaxb-annotations-2.10.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-module-paranamer-2.10.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jackson-module-scala_2.12-2.10.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jakarta.activation-api-1.2.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jakarta.annotation-api-1.3.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jakarta.inject-2.6.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jakarta.servlet-api-4.0.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jakarta.validation-api-2.0.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jakarta.ws.rs-api-2.1.6.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.71
 71000.1-1-1.p0.25570994/lib/spark3/jars/jakarta.xml.bind-api-2.3.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/janino-3.0.16.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/JavaEWAH-0.3.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/javassist-3.25.0-GA.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/javax.activation-1.2.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/javax.activation-api-1.2.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/javax.annotation-api-1.3.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/javax.el-3.0.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/javax.inject-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/javax.jdo
 -3.2.0-m3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/javax.servlet-api-3.1.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/javolution-5.5.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jaxb-api-2.2.11.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jaxb-runtime-2.3.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jcip-annotations-1.0-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jcl-over-slf4j-1.7.30.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jersey-client-2.34.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jersey-common-2.34.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jersey-container-servlet-2.34.jar:/opt/cloudera/parcels/SPARK3-
 3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jersey-container-servlet-core-2.34.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jersey-hk2-2.34.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jersey-server-2.34.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jetty-rewrite-9.4.40.v20210413.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jetty-util-9.4.40.v20210413.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jetty-util-ajax-9.4.40.v20210413.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/JLargeArrays-1.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jline-2.14.6.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/joda-time-2.9.9.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.
 p0.25570994/lib/spark3/jars/jodd-util-6.0.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jpam-1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/json-1.8.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/json4s-ast_2.12-3.7.0-M11.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/json4s-core_2.12-3.7.0-M11.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/json4s-jackson_2.12-3.7.0-M11.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/json4s-scalap_2.12-3.7.0-M11.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/json-path-2.4.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/json-smart-2.3.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jsp-api-2.1.jar:/opt/clou
 dera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jsr305-3.0.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/JTransforms-3.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/jul-to-slf4j-1.7.30.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kafka-clients-2.5.0.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerb-admin-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerb-client-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerb-common-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerb-core-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerb-crypto-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/s
 park3/jars/kerb-identity-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerb-server-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerb-simplekdc-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerb-util-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerby-asn1-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerby-config-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerby-pkix-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerby-util-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kerby-xdr-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kryo-shaded-4.0.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2
 .7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-client-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-admissionregistration-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-apiextensions-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-apps-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-autoscaling-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-batch-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-certificates-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-common-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes
 -model-coordination-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-core-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-discovery-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-events-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-extensions-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-flowcontrol-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-metrics-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-networking-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-node-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1
 .3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-policy-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-rbac-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-scheduling-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/kubernetes-model-storageclass-5.4.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/lapack-2.2.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/leveldbjni-all-1.8.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/libfb303-0.9.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/libthrift-0.13.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/cloudera/p
 arcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/log4j-1.2.17-cloudera1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/logging-interceptor-3.12.12.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/logredactor-2.0.13.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/lz4-java-1.7.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/macro-compat_2.12-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/memory-0.9.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/metrics-core-4.2.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/metrics-graphite-4.2.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/metrics-jmx-4.2.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/
 lib/spark3/jars/metrics-json-4.2.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/metrics-jvm-4.2.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/minlog-1.3.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/netty-all-4.1.63.Final.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/nimbus-jose-jwt-7.9.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/objenesis-2.6.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/okhttp-2.7.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/okhttp-3.12.12.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/okio-1.14.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/opencsv-2.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.
 p0.25570994/lib/spark3/jars/orc-core-1.5.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/orc-mapreduce-1.5.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/orc-shims-1.5.1.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/oro-2.0.8.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/osgi-resource-locator-1.0.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/paranamer-2.8.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/parquet-column-1.10.99.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/parquet-common-1.10.99.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/parquet-encoding-1.10.99.7.1.7.1000-141.jar:/opt/cloudera/parcels
 /SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/parquet-format-structures-1.10.99.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/parquet-hadoop-1.10.99.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/parquet-jackson-1.10.99.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/py4j-0.10.9.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/pyrolite-4.30.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/re2j-1.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/reflections-0.9.10.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/RoaringBitmap-0.9.0.jar:/opt/cloudera/parcels/SPARK3-3.
 2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/rocksdbjni-6.20.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/scala-collection-compat_2.12-2.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/scala-compiler-2.12.10.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/scala-library-2.12.10.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/scala-parser-combinators_2.12-1.1.2.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/scala-reflect-2.12.10.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/scala-xml_2.12-1.2.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/shapeless_2.12-2.3.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/shims-0.9.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.
 p0.25570994/lib/spark3/jars/sketches-core-0.9.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/slf4j-api-1.7.30.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/slf4j-log4j12-1.7.30.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/snakeyaml-1.26.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/snappy-java-1.1.8.4.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-avro_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-catalyst_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-core_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-graphx_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1
 .3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-hadoop-cloud_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-hive_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-kubernetes_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-kvstore_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-launcher_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-mllib_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-mllib-local_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-network-common_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/clo
 udera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-network-shuffle_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-repl_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-sketch_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-sql_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-sql-kafka-0-10_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-streaming_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-streaming-kafka-0-10_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-tags_2.12-3.2.
 1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-tags_2.12-3.2.1.3.2.7171000.1-1-tests.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-token-provider-kafka-0-10_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-unsafe_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spark-yarn_2.12-3.2.1.3.2.7171000.1-1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spire_2.12-0.17.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spire-macros_2.12-0.17.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spire-platform_2.12-0.17.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/spire-util_2.12-0.17.0.jar:/opt/cloudera/parcels
 /SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/sqlline-1.3.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/ST4-4.0.4.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/stax2-api-3.1.4.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/stax-api-1.0.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/stream-2.9.6.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/threeten-extra-1.5.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/tink-1.6.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/token-provider-1.1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/transaction-api-1.1.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/univocity-parsers-2.9.1.jar:/op
 t/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/velocity-1.5.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/velocity-engine-core-2.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/wildfly-openssl-1.0.7.Final.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/woodstox-core-5.0.3.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/xbean-asm9-shaded-4.20.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/xz-1.8.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/zjsonpatch-0.3.0.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/zookeeper-3.5.5.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/zookeeper-jute-3.5.5.7.1.7.1000-141.jar:/opt/cloudera/parcels/SPARK3-3.2.1.
 3.2.7171000.1-1-1.p0.25570994/lib/spark3/jars/zstd-jni-1.5.0-4.jar:/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/conf/yarn-conf/:/etc/hive/conf/:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/accessors-smart-1.2.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/accessors-smart.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/animal-sniffer-annotations-1.18.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/animal-sniffer-annotations.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/asm-5.0.4.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/asm.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/avro.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/aws-java-sdk-bundle-1.11.901.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hado
 op/client/aws-java-sdk-bundle.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/azure-data-lake-store-sdk-2.3.6.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/azure-data-lake-store-sdk.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/checker-qual-2.8.1.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/checker-qual.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-beanutils-1.9.4.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-beanutils.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-cli.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-codec-1.14.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-codec.jar:
 /opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-collections.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-compress-1.19.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-compress.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-configuration2-2.1.1.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-configuration2.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-io-2.6.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-io.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-lang.jar:/opt/cloudera/parcels/CDH-7.1.7-1.
 cdh7.1.7.p0.15945976/lib/hadoop/client/commons-lang3-3.8.1.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-lang3.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-logging.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-math3-3.1.1.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-math3.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-net-3.6.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/commons-net.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/curator-client-4.3.0.7.1.7.0-551.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/curator-client.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/curato
 r-framework-4.3.0.7.1.7.0-551.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/curator-framework.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/curator-recipes-4.3.0.7.1.7.0-551.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/curator-recipes.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/failureaccess-1.0.1.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/failureaccess.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/gson-2.2.4.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/gson.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/guava-28.1-jre.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/guava.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-annotations-3.1.1.7.1.7.0-551.jar:/opt/cloudera/par
 cels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-annotations.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-auth-3.1.1.7.1.7.0-551.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-auth.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-aws-3.1.1.7.1.7.0-551.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-aws.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-azure-3.1.1.7.1.7.0-551.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-azure-datalake-3.1.1.7.1.7.0-551.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-azure.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-common-3.1.1.7.1.7.0-551.jar:/opt/
 cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-common.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-hdfs-client-3.1.1.7.1.7.0-551.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-hdfs-client.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-mapreduce-client-common-3.1.1.7.1.7.0-551.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-mapreduce-client-core-3.1.1.7.1.7.0-551.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-mapreduce-client-jobclient-3.1.1.7.1.7.0-551.jar:/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-mapreduce-client-jobclient.j
ar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-yarn-api-3.1.1.7.1.7.0-551.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-yarn-api.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-yarn-client-3.1.1.7.1.7.0-551.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-yarn-client.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-yarn-common-3.1.1.7.1.7.0-551.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/hadoop-yarn-common.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/httpclient-4.5.13.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/httpclient.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/httpcore-4.4.13.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/httpcore.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/j2objc-annotations-1.3.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/j2objc-annotations.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jakarta.activation-api-1.2.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jakarta.activation-api.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jakarta.xml.bind-api-2.3.2.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jakarta.xml.bind-api.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/javax.activation-api-1.2.0.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/javax.activation-api.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jaxb-api-2.2.11.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jaxb-api.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jcip-annotations-1.0-1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jcip-annotations.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/json-smart-2.3.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/json-smart.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jsp-api-2.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jsp-api.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jsr305-3.0.0.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jsr305.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jsr311-api-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/jsr311-api.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-admin-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-admin.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-client-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-client.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-common-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-common.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-core-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-core.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-crypto-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-crypto.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-identity-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-identity.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-server-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-server.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-simplekdc-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-simplekdc.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-util-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerb-util.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerby-asn1-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerby-asn1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerby-config-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerby-config.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerby-pkix-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerby-pkix.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerby-util-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerby-util.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerby-xdr-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/kerby-xdr.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/listenablefuture.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/log4j-1.2.17-cloudera1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/log4j.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/nimbus-jose-jwt-7.9.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/nimbus-jose-jwt.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/okhttp-2.7.5.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/okhttp.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/okio-1.6.0.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/okio.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/paranamer-2.8.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/paranamer.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/protobuf-java-2.5.0.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/protobuf-java.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/re2j-1.2.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/re2j.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/snappy-java-1.1.7.7.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/snappy-java.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/stax2-api-3.1.4.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/stax2-api.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/token-provider-1.1.1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/token-provider.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/wildfly-openssl-1.0.7.Final.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/wildfly-openssl.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/woodstox-core-5.0.3.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/woodstox-core.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/xz-1.8.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hadoop/client/xz.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hbase/bin/../lib/client-facing-thirdparty/audience-annotations-0.5.0.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hbase/bin/../lib/client-facing-thirdparty/commons-logging-1.2.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hbase/bin/../lib/client-facing-thirdparty/findbugs-annotations-1.3.9-1.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hbase/bin/../lib/client-facing-thirdparty/htrace-core4-4.2.0-incubating.jar:
/opt/cloudera/parcels/CDH-7.1.7-1.cdh7.1.7.p0.15945976/lib/hbase/bin/../lib/shaded-clients/hbase-shaded-mapreduce-2.2.3.7.1.7.0-551.jar:
/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/atlas-connector/spark-atlas-connector-assembly-3.2.1.3.2.7171000.1-1.jar:
/opt/cloudera/parcels/SPARK3-3.2.1.3.2.7171000.1-1-1.p0.25570994/lib/spark3/atlas-connector/spark-atlas-connector_2.12-3.2.1.3.2.7171000.1-1.jar`


[GitHub] [hudi] stantaov commented on issue #7899: [SUPPORT] Error "Could not create interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?" While Deleting Data

Posted by "stantaov (via GitHub)" <gi...@apache.org>.
stantaov commented on issue #7899:
URL: https://github.com/apache/hudi/issues/7899#issuecomment-1431925449

   ... and these are the dependencies I am compiling with:
   
   ```
   "org.apache.spark" %% "spark-sql" % "3.2.1",
   "com.typesafe" % "config" % "1.4.0",
   "org.slf4j" % "slf4j-simple" % "1.7.30",
   "org.apache.hudi" %% "hudi-spark3.2-bundle" % "0.12.2"
   ```
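   
   A commonly used variation on this dependency list (an editorial sketch, not something suggested in this thread) is to declare Spark itself as `provided`, so the application jar does not carry a second copy of Spark classes alongside the cluster's own jars:
   
   ```
   // Sketch of a build.sbt dependency list with Spark marked "provided";
   // the cluster-supplied Spark jars are then the only Spark on the classpath.
   libraryDependencies ++= Seq(
     "org.apache.spark" %% "spark-sql" % "3.2.1" % "provided",
     "com.typesafe" % "config" % "1.4.0",
     "org.slf4j" % "slf4j-simple" % "1.7.30",
     "org.apache.hudi" %% "hudi-spark3.2-bundle" % "0.12.2"
   )
   ```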




[GitHub] [hudi] xushiyan commented on issue #7899: [SUPPORT] Error "Could not create interface org.apache.hudi.org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?" While Deleting Data

Posted by "xushiyan (via GitHub)" <gi...@apache.org>.
xushiyan commented on issue #7899:
URL: https://github.com/apache/hudi/issues/7899#issuecomment-1428636069

   For classpath dependency conflicts, can you share the exact jars passed to the Spark job, i.e. what is in `--jars /tmp/XXXXXX.jar`? And what jars are on the classpath? You can enable printing the classpath at the start of the Spark job and inspect it.
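   
   For reference, a minimal sketch of one way to do that (illustrative only; it assumes the print runs in the driver's `main` before any Spark work, and `ClasspathDump` is a hypothetical helper, not part of the user's job):
   
   ```
   // Illustrative sketch: dump every JVM classpath entry at driver startup,
   // sorted, so the log shows exactly which jars the job actually resolved.
   object ClasspathDump {
     def printAll(): Unit =
       System.getProperty("java.class.path")
         .split(java.io.File.pathSeparator)
         .sorted
         .foreach(println)
   }
   ```
   
   Executor classpaths can differ from the driver's, so it may also be worth running the same print inside a task.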

