Posted to user@hadoop.apache.org by xeonmailinglist <xe...@gmail.com> on 2015/07/07 11:33:54 UTC

get the Hadoop working dir using bash commands?

Is it possible to get the Hadoop working dir using bash commands?

Re: get the Hadoop working dir using bash commands?

Posted by Chris Nauroth <cn...@hortonworks.com>.
Hello,

Are you looking for the working directory that HDFS uses to resolve
relative paths in commands like "hdfs dfs -ls myRelativePath"?  If so, the
working directory is the current user's home directory.  HDFS defines the
home directory as a common prefix followed by the username.  The prefix is
controlled by the configuration property dfs.user.home.dir.prefix, and the
default is /user.

<property>
  <name>dfs.user.home.dir.prefix</name>
  <value>/user</value>
  <description>The directory to prepend to user name to get the user's
  home directory.
  </description>
</property>
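
With the default prefix, a relative HDFS path resolves under
/user/<username>.  As a minimal sketch, assuming a user named chris (a
placeholder name here):

> hdfs dfs -ls myRelativePath

would list the contents of /user/chris/myRelativePath.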

You mentioned wanting to access this in bash.  I think you can combine
"hdfs getconf" with the current user to get it.  For example:


> echo "$(hdfs getconf -confKey dfs.user.home.dir.prefix)/$USER"
/user/chris
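
If you need to reuse that value in a script, you can capture it in a shell
variable first.  This is just a minimal sketch, and HDFS_USER_HOME is an
illustrative name, not something Hadoop defines:

> HDFS_USER_HOME="$(hdfs getconf -confKey dfs.user.home.dir.prefix)/$USER"
> hdfs dfs -ls "$HDFS_USER_HOME"

Quoting the variable keeps the path intact if the prefix ever contains
unusual characters.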

If you're running a secured cluster with complex auth-to-local name
conversion rules, then simply using $USER might not be sufficient.  If you
find that's the case, then look at using the "hadoop kerbname <your
Kerberos principal>" command.  The "hadoop kerbname" command alias only
exists in trunk right now.  For 2.x builds, you can still get the same
effect by running the underlying class directly: "hadoop
org.apache.hadoop.security.HadoopKerberosName <your Kerberos principal>".
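
For example, a hedged sketch of the 2.x approach (chris@EXAMPLE.COM is a
placeholder principal, and the "Name: ... to ..." output format comes from
that class's main method, so it may vary slightly by version):

> hadoop org.apache.hadoop.security.HadoopKerberosName chris@EXAMPLE.COM
Name: chris@EXAMPLE.COM to chris

You could then take the last field of that output and splice it into the
same prefix lookup:

> echo "$(hdfs getconf -confKey dfs.user.home.dir.prefix)/$(hadoop org.apache.hadoop.security.HadoopKerberosName chris@EXAMPLE.COM | awk '{print $NF}')"
/user/chris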


I hope this helps.

--Chris Nauroth




On 7/7/15, 2:33 AM, "xeonmailinglist" <xe...@gmail.com> wrote:

>Is it possible to get the Hadoop working dir using bash commands?

