Posted to commits@hudi.apache.org by "Raymond Xu (Jira)" <ji...@apache.org> on 2022/04/25 12:57:00 UTC

[jira] [Assigned] (HUDI-1061) Hudi CLI savepoint command fail because of spark conf loading issue

     [ https://issues.apache.org/jira/browse/HUDI-1061?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Raymond Xu reassigned HUDI-1061:
--------------------------------

    Assignee: sivabalan narayanan

> Hudi CLI savepoint command fail because of spark conf loading issue
> -------------------------------------------------------------------
>
>                 Key: HUDI-1061
>                 URL: https://issues.apache.org/jira/browse/HUDI-1061
>             Project: Apache Hudi
>          Issue Type: Bug
>          Components: cli
>            Reporter: Wenning Ding
>            Assignee: sivabalan narayanan
>            Priority: Major
>
> h3. Reproduce
> Launch hudi-cli.sh and run these two commands:
> {code:java}
> connect --path s3://wenningd-emr-dev/hudi/tables/events/hudi_null01
> savepoint create --commit 20191122225109
> {code}
> You would see this error:
> {code:java}
> java.io.FileNotFoundException: File file:/tmp/spark-events does not exist
>     at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:640)
>     at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:866)
>     at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:630)
>     at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:452)
>     at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:97)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:523)
>     at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
>     at org.apache.hudi.cli.utils.SparkUtil.initJavaSparkConf(SparkUtil.java:85)
>     at org.apache.hudi.cli.commands.SavepointsCommand.savepoint(SavepointsCommand.java:79)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.springframework.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:216)
>     at org.springframework.shell.core.SimpleExecutionStrategy.invoke(SimpleExecutionStrategy.java:68)
>     at org.springframework.shell.core.SimpleExecutionStrategy.execute(SimpleExecutionStrategy.java:59)
>     at org.springframework.shell.core.AbstractShell.executeCommand(AbstractShell.java:134)
>     at org.springframework.shell.core.JLineShell.promptLoop(JLineShell.java:533)
>     at org.springframework.shell.core.JLineShell.run(JLineShell.java:179)
>     at java.lang.Thread.run(Thread.java:748){code}
> Although {{spark-defaults.conf}} sets {{spark.eventLog.dir hdfs:///var/log/spark/apps}}, the Hudi CLI still uses {{file:/tmp/spark-events}} as the event log dir, which means the SparkContext it creates does not pick up the settings from {{spark-defaults.conf}}.
> We should make the {{initJavaSparkConf}} method able to read configs from the Spark config file ({{spark-defaults.conf}}).
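> One possible direction is to load {{spark-defaults.conf}} explicitly before building the SparkContext. A minimal sketch under that assumption (not the actual Hudi change; the class name {{SparkDefaultsLoader}}, the helper {{loadDefaults}}, and the reliance on {{SPARK_HOME}} are illustrative assumptions):
> {code:java}
> // A plain "new SparkConf()" only reads spark.* JVM system properties, which
> // spark-submit sets but the Hudi CLI does not, so spark-defaults.conf has to
> // be read explicitly when the CLI builds its own SparkContext.
> import org.apache.spark.SparkConf;
>
> import java.io.FileInputStream;
> import java.io.IOException;
> import java.util.Properties;
>
> public class SparkDefaultsLoader {
>   public static SparkConf loadDefaults(SparkConf conf) throws IOException {
>     String sparkHome = System.getenv("SPARK_HOME");
>     if (sparkHome == null) {
>       return conf; // nothing to load; keep whatever is already set
>     }
>     Properties props = new Properties();
>     // spark-defaults.conf uses whitespace-separated "key value" pairs,
>     // which java.util.Properties parses directly.
>     try (FileInputStream in = new FileInputStream(sparkHome + "/conf/spark-defaults.conf")) {
>       props.load(in);
>     }
>     for (String key : props.stringPropertyNames()) {
>       if (key.startsWith("spark.") && !conf.contains(key)) {
>         conf.set(key, props.getProperty(key).trim()); // e.g. spark.eventLog.dir
>       }
>     }
>     return conf;
>   }
> }
> {code}
> With something like this, {{initJavaSparkConf}} could pass its SparkConf through {{loadDefaults}} so that {{spark.eventLog.dir}} from {{spark-defaults.conf}} takes effect instead of the {{file:/tmp/spark-events}} default.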



--
This message was sent by Atlassian Jira
(v8.20.7#820007)