Posted to common-dev@hadoop.apache.org by "Mahadev konar (JIRA)" <ji...@apache.org> on 2008/04/18 02:21:21 UTC
[jira] Updated: (HADOOP-1593) FsShell should work with paths in non-default FileSystem
[ https://issues.apache.org/jira/browse/HADOOP-1593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Mahadev konar updated HADOOP-1593:
----------------------------------
Release Note:
This change allows non-default paths to be specified in FsShell commands.
So, you can now run hadoop dfs -ls hdfs://remotehost1:port/path
and hadoop dfs -ls hdfs://remotehost2:port/path without changing the config.
> FsShell should work with paths in non-default FileSystem
> --------------------------------------------------------
>
> Key: HADOOP-1593
> URL: https://issues.apache.org/jira/browse/HADOOP-1593
> Project: Hadoop Core
> Issue Type: New Feature
> Components: fs
> Reporter: Doug Cutting
> Assignee: Mahadev konar
> Fix For: 0.17.0
>
> Attachments: patch_1593.patch, patch_1593_1.patch
>
>
> If the default filesystem is, e.g., hdfs://foo:8888/, one should still be able to do 'bin/hadoop fs -ls hdfs://bar:9999/' or 'bin/hadoop fs -ls s3://cutting/foo'. Currently these generate a filesystem mismatch exception. This is because FsShell assumes that all paths are in the default FileSystem. Rather, the default filesystem should only be used for paths that do not specify a FileSystem. This would easily be accomplished by using Path#getFileSystem().
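> The resolution logic described above (use each path's own scheme and authority, falling back to the default filesystem only when the path specifies none) can be sketched in plain Java. This is a hypothetical illustration using java.net.URI, not Hadoop's actual Path#getFileSystem() implementation; the class and method names are invented for the example.

```java
import java.net.URI;

public class FsResolverSketch {
    // Hypothetical sketch: each path picks its own filesystem by scheme;
    // only scheme-less paths fall back to the configured default.
    static String resolveFs(String path, String defaultFs) {
        URI uri = URI.create(path);
        if (uri.getScheme() == null) {
            // No scheme given, e.g. "/user/cutting": use the default filesystem.
            return defaultFs;
        }
        // Scheme present, e.g. "hdfs://bar:9999/": use that filesystem,
        // regardless of what the default is.
        String authority = uri.getAuthority() == null ? "" : uri.getAuthority();
        return uri.getScheme() + "://" + authority;
    }

    public static void main(String[] args) {
        String defaultFs = "hdfs://foo:8888";
        // A fully-qualified path resolves to its own filesystem.
        System.out.println(resolveFs("hdfs://bar:9999/some/path", defaultFs));
        // A bare path resolves to the default filesystem.
        System.out.println(resolveFs("/user/cutting", defaultFs));
    }
}
```

> With the old behavior, the first call would instead raise a filesystem mismatch because every path was forced onto the default filesystem.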
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.