Posted to user@hadoop.apache.org by Tallanel Siriel <ta...@gmail.com> on 2020/08/14 12:08:36 UTC

hdfs cli launches a java.io.FileNotFoundException related to hadoop-mapreduce.jobsummary.log before showing the result

Hello.

I am writing because I recently migrated from HDP 2.6.2 to HDP 3.1.4,
and I am encountering a strange error when I use the hdfs CLI.

I run the following command:
hdfs dfs -ls /

Before the result is printed, I get the following error:
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /<HDFS_LOG_FOLDER>/<MYUSER>/hadoop-mapreduce.jobsummary.log (No such file or directory)
        at java.io.FileOutputStream.open0(Native Method)
        at java.io.FileOutputStream.open(FileOutputStream.java:270)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
        at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
        at org.apache.log4j.RollingFileAppender.setFile(RollingFileAppender.java:207)
        at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
        at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
        at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
        at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
        at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
        at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
        at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
        at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
        at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
        at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:412)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
        at org.apache.hadoop.fs.FsShell.<clinit>(FsShell.java:48)

<HDFS_LOG_FOLDER> is the folder used to store the HDFS logs (defined in the
property hdfs_log_dir_prefix).
<MYUSER> is the local user I am logged in as on the server when I launch
the hdfs CLI.

After this exception, the result of the ls is displayed normally.

I am wondering where this problem could come from.

I didn't have this error when I was using HDP 2.6.2, so I'm wondering what
is happening.

Thank you in advance for your help.

Tallanel

Re: hdfs cli launches a java.io.FileNotFoundException related to hadoop-mapreduce.jobsummary.log before showing the result

Posted by Tallanel Siriel <ta...@gmail.com>.
Hello.

I had the same problem with every hdfs subcommand, so I investigated to
find its origin.

which hdfs
indicated that the hdfs command is in /usr/bin.

/usr/bin/hdfs calls another script:
/usr/hdp/3.1.4.0-315//hadoop-hdfs/bin/hdfs.distro "$@"

At the end, hdfs.distro calls a function:
hadoop_generic_java_subcmd_handler

This function is defined in
/usr/hdp/current/hadoop-client/libexec/hadoop-functions.sh.
It calls hadoop_finalize, which in turn calls hadoop_shellprofiles_finalize.
hadoop_shellprofiles_finalize iterates over the shell profiles, which are
hdfs, mapreduce and yarn, and for each of them calls a function named
_<hdfs|mapreduce|yarn>_hadoop_finalize.
After investigation, the function relevant to my problem was
_yarn_hadoop_finalize, which lives in
/usr/hdp/current/hadoop-client/libexec/shellprofile.d/hadoop-yarn.sh.
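For reference, my understanding of that dispatch is something like the
simplified sketch below. This is an approximation based on reading
hadoop-functions.sh, not the exact upstream code; the dummy
_yarn_hadoop_finalize hook at the bottom is only there for the demo:

```shell
#!/usr/bin/env bash
# Simplified sketch (not the exact upstream code) of how
# hadoop_shellprofiles_finalize dispatches to per-profile hooks.

hadoop_shellprofiles_finalize ()
{
  local profile
  # HADOOP_SHELL_PROFILES holds the registered profiles, e.g. "hdfs mapreduce yarn"
  for profile in ${HADOOP_SHELL_PROFILES}; do
    # Call _<profile>_hadoop_finalize if such a function is defined
    if declare -F "_${profile}_hadoop_finalize" >/dev/null; then
      "_${profile}_hadoop_finalize"
    fi
  done
}

# Demo with a dummy profile hook standing in for the real one
HADOOP_SHELL_PROFILES="yarn"
_yarn_hadoop_finalize () { FINALIZED="yes"; }
hadoop_shellprofiles_finalize
echo "${FINALIZED}"
```

So every hdfs invocation runs the yarn profile's finalize hook, which is how
YARN-related options end up on an HDFS client command line.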

_yarn_hadoop_finalize adds options to the HADOOP_OPTS environment variable.

I performed some tests, and the problem came from this call:
hadoop_add_param HADOOP_OPTS yarn.log.dir "-Dyarn.log.dir=${yld}"
where yld is a local variable set to $HADOOP_LOG_DIR.
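For context, hadoop_add_param roughly behaves like the sketch below: it
appends the given text to the named variable only if the marker string is
not already present. This is a simplified approximation of the function in
hadoop-functions.sh, not the exact upstream code, and the yld value is just
a stand-in:

```shell
#!/usr/bin/env bash
# Simplified sketch of hadoop_add_param (approximation of the version
# in hadoop-functions.sh, not the exact upstream code).
hadoop_add_param ()
{
  # $1 = variable name, $2 = marker to test for, $3 = text to append
  local current="${!1}"
  # Only append if the marker is not already in the variable
  if [[ "${current}" != *"$2"* ]]; then
    printf -v "$1" '%s %s' "${current}" "$3"
  fi
}

HADOOP_OPTS=""
yld="/var/log/hadoop/hdfs"   # stand-in for the value taken from HADOOP_LOG_DIR
hadoop_add_param HADOOP_OPTS yarn.log.dir "-Dyarn.log.dir=${yld}"
echo "${HADOOP_OPTS}"
```

In my case $HADOOP_LOG_DIR resolved to a per-user folder under the HDFS log
directory that does not exist for ordinary users, hence the
FileNotFoundException from the log4j file appender.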

Knowing that, I just had to modify the hadoop-env section of the HDFS
configuration in Ambari, adding the following line at the end:
export HADOOP_CLIENT_OPTS="-Dyarn.log.dir=. $HADOOP_CLIENT_OPTS"
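My guess as to why this works (based on reading hadoop-functions.sh, so
take it with a grain of salt): HADOOP_CLIENT_OPTS seems to be merged into
HADOOP_OPTS before the finalize step runs, and hadoop_add_param declines to
append when its marker (here yarn.log.dir) is already present, so the
broken -Dyarn.log.dir pointing at the missing folder is never added. A
minimal sketch of that interaction, with hadoop_add_param again simplified
and the ordering being my reading of the scripts rather than a verified
fact:

```shell
#!/usr/bin/env bash
# Sketch of why pre-seeding yarn.log.dir in HADOOP_CLIENT_OPTS suppresses
# the bad value. hadoop_add_param is a simplified approximation of the
# function in hadoop-functions.sh; /missing/folder is a made-up path.
hadoop_add_param ()
{
  local current="${!1}"
  if [[ "${current}" != *"$2"* ]]; then
    printf -v "$1" '%s %s' "${current}" "$3"
  fi
}

# Step 1: the client opts (with the fix) are merged into HADOOP_OPTS
HADOOP_CLIENT_OPTS="-Dyarn.log.dir=."
HADOOP_OPTS="${HADOOP_OPTS} ${HADOOP_CLIENT_OPTS}"

# Step 2: _yarn_hadoop_finalize later tries to add the broken value,
# but the yarn.log.dir marker is already present, so it is declined
hadoop_add_param HADOOP_OPTS yarn.log.dir "-Dyarn.log.dir=/missing/folder"
echo "${HADOOP_OPTS}"
```

If I am wrong about the ordering, the outcome should still be the same,
since with duplicate -D options the JVM takes the later one.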

With this line, I think the impact is minimal because it only concerns the
client. The hadoop-mapreduce.jobsummary.log file does not seem very big, so
I guess writing it to the current folder is fine.

If anyone has a better understanding of the situation, I would gladly hear
it. Why do hdfs CLI commands set YARN properties?

Best regards.

Tallanel
