Posted to user@hbase.apache.org by Tim Robertson <ti...@gmail.com> on 2014/11/05 15:50:12 UTC

No FileSystem for scheme: file

Hi all,

I'm seeing the following:
  java.io.IOException: No FileSystem for scheme: file
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2584)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
        ...
        at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:778)
        at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addHBaseDependencyJars(TableMapReduceUtil.java:707)
        at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:752)
        at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.initTableMapperJob(TableMapReduceUtil.java:192)
        ...

I have the Hadoop HDFS jar on the CP, and I am submitting using the
MapReduce 2 (i.e. YARN) jars.  I build a jar-with-dependencies (and also
provide the HDFS jar on the CP directly); the dependency tree is:
  https://gist.github.com/timrobertson100/027f97d038df53cc836f

It all worked before, but I'm migrating to YARN and 0.98 (CDH 5.2.0)
from 0.94 and MR1, so I have obviously messed up the CP somehow.

Has anyone come across this please?

Thanks,
Tim

Re: No FileSystem for scheme: file

Posted by Stack <st...@duboce.net>.
Hey Tim:

Add hadoop-common? It has the 'file:///' implementation (look for
LocalFileSystem).  See if that works.
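
A quick way to check from the submitting JVM is to resolve the scheme
directly; a minimal, untested sketch (the FsCheck class name is made up
for illustration):

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class FsCheck {
      public static void main(String[] args) throws Exception {
        // Resolves the "file" scheme the same way TableMapReduceUtil ends
        // up doing; this throws "No FileSystem for scheme: file" when the
        // LocalFileSystem entry from hadoop-common is not on the classpath.
        FileSystem fs = FileSystem.get(URI.create("file:///"), new Configuration());
        System.out.println(fs.getClass().getName());
      }
    }

Run it with the exact same -cp you use for the job; it should print
org.apache.hadoop.fs.LocalFileSystem.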

Hope all is well,
St.Ack

On Wed, Nov 5, 2014 at 7:45 AM, Tim Robertson <ti...@gmail.com> wrote:

> Hi Sean,
>
> We are using CM, and Hue, Hive, etc. all work, but for some reason I can't
> get the CP correct for this job, which I submit using:
>
> java -cp :$HADOOP_HOME/hdfs/hadoop-hdfs-2.5.0-cdh5.2.0.jar:./:target/classes:target/cube-0.17-SNAPSHOT-jar-with-dependencies.jar org.gbif.metrics.cube.occurrence.backfill.Backfill
>
> Thanks,
> Tim

Re: No FileSystem for scheme: file

Posted by Tim Robertson <ti...@gmail.com>.
Thanks for taking the time to detail the explanation, Walter. That was
indeed the issue.

On Wed, Nov 5, 2014 at 6:08 PM, Walter King <wa...@adroll.com> wrote:

> We ran into this issue.  This post:
> http://stackoverflow.com/questions/17265002/hadoop-no-filesystem-for-scheme-file
> was helpful.

Re: No FileSystem for scheme: file

Posted by Walter King <wa...@adroll.com>.
We ran into this issue.  This post:
http://stackoverflow.com/questions/17265002/hadoop-no-filesystem-for-scheme-file
was helpful.

"Different JARs (hadoop-common for LocalFileSystem, hadoop-hdfs for
DistributedFileSystem) each contain a different file called
org.apache.hadoop.fs.FileSystem in their META-INF/services directory."

Basically, these two files need to be merged into one and put into the jar
at META-INF/services/org.apache.hadoop.fs.FileSystem:

# This is a concatenation of the FileSystem service files from two different
# dependencies that overwrite each other during maven builds. This file
# overrides all other files with the same name. See
# http://stackoverflow.com/questions/17265002/hadoop-no-filesystem-for-scheme-file

org.apache.hadoop.fs.LocalFileSystem
org.apache.hadoop.fs.viewfs.ViewFileSystem
org.apache.hadoop.fs.s3.S3FileSystem
org.apache.hadoop.fs.s3native.NativeS3FileSystem
org.apache.hadoop.fs.ftp.FTPFileSystem
org.apache.hadoop.fs.HarFileSystem

org.apache.hadoop.hdfs.DistributedFileSystem
org.apache.hadoop.hdfs.HftpFileSystem
org.apache.hadoop.hdfs.HsftpFileSystem
org.apache.hadoop.hdfs.web.WebHdfsFileSystem
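
If you switch the fat-jar build to the maven-shade-plugin, its
ServicesResourceTransformer does this merge for you. You can also sidestep
the service lookup entirely by naming the implementations in the job
configuration; a minimal, untested sketch covering just the two schemes
involved here:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    Configuration conf = HBaseConfiguration.create();
    // FileSystem.getFileSystemClass() consults fs.<scheme>.impl before it
    // falls back to the META-INF/services entries, so setting these keys
    // works even when the service files were clobbered in the fat jar.
    conf.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
    conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");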

On Wed, Nov 5, 2014 at 8:11 AM, Tim Robertson <ti...@gmail.com> wrote:

> Thanks St.Ack, Sean

Re: No FileSystem for scheme: file

Posted by Tim Robertson <ti...@gmail.com>.
Thanks St.Ack, Sean

I'll change the submission process first thing tomorrow - hadoop-common is
on the CP (in the fat jar) and it did work before I started ripping out the
MR1 stuff.

[Things are good St.Ack - thanks.  Hope you're also well]

On Wed, Nov 5, 2014 at 4:59 PM, Sean Busbey <bu...@cloudera.com> wrote:

> The error sounds like you do not have your HDFS configs in the classpath.

Re: No FileSystem for scheme: file

Posted by Sean Busbey <bu...@cloudera.com>.
The error sounds like you do not have your HDFS configs in the classpath.

Generally, you should be submitting the job via the 'hadoop jar' command
(and your main class should be implementing Tool). This will take care of
setting the correct classpath for both the Hadoop related jars and
configuration files. See the Ref Guide section on running MapReduce jobs[1].
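
The driver shape is roughly the following; a minimal, untested sketch (the
Backfill name is borrowed from your command, with the actual job setup
elided):

    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class Backfill extends Configured implements Tool {
      @Override
      public int run(String[] args) throws Exception {
        // Build and submit the MapReduce job from getConf(), which already
        // carries the cluster configs plus any -conf/-D/-libjars generic
        // options that ToolRunner parsed off the command line.
        return 0;
      }

      public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Backfill(), args));
      }
    }

Submission then looks something like 'hadoop jar
target/cube-0.17-SNAPSHOT.jar org.gbif.metrics.cube.occurrence.backfill.Backfill'
instead of a hand-built java -cp.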

With this approach, you can list the hadoop / hbase related artifacts as
"provided" in your pom. Hadoop will add what it needs, and
TableMapReduceUtil will add the jars needed for HBase.

In addition, you should use the -libjars argument to that command if you
need things other than your application jar (and using this for
dependencies is preferable to building a jar-with-dependencies).

Overall, this sounds like a CM/CDH configuration deployment issue and not
something specific to HBase. In the future please consider sending these
kinds of vendor-specific questions to the community support mechanisms of
said vendor. In Cloudera's case, that's http://community.cloudera.com/

-Sean

[1]: http://hbase.apache.org/book.html#mapreduce

On Wed, Nov 5, 2014 at 9:45 AM, Tim Robertson <ti...@gmail.com> wrote:

> Hi Sean,
>
> We are using CM, and Hue, Hive, etc. all work, but for some reason I can't
> get the CP correct for this job, which I submit using:
>
> java -cp :$HADOOP_HOME/hdfs/hadoop-hdfs-2.5.0-cdh5.2.0.jar:./:target/classes:target/cube-0.17-SNAPSHOT-jar-with-dependencies.jar org.gbif.metrics.cube.occurrence.backfill.Backfill
>
> Thanks,
> Tim



-- 
Sean

Re: No FileSystem for scheme: file

Posted by Tim Robertson <ti...@gmail.com>.
Hi Sean,

We are using CM, and Hue, Hive, etc. all work, but for some reason I can't
get the CP correct for this job, which I submit using:

java -cp :$HADOOP_HOME/hdfs/hadoop-hdfs-2.5.0-cdh5.2.0.jar:./:target/classes:target/cube-0.17-SNAPSHOT-jar-with-dependencies.jar org.gbif.metrics.cube.occurrence.backfill.Backfill

Thanks,
Tim


On Wed, Nov 5, 2014 at 4:30 PM, Sean Busbey <bu...@cloudera.com> wrote:

> How are you submitting the job?
>
> How are your cluster configuration files deployed (i.e. are you using CM)?

Re: No FileSystem for scheme: file

Posted by Sean Busbey <bu...@cloudera.com>.
How are you submitting the job?

How are your cluster configuration files deployed (i.e. are you using CM)?

On Wed, Nov 5, 2014 at 8:50 AM, Tim Robertson <ti...@gmail.com> wrote:

> Hi all,
>
> I'm seeing the following
>   java.io.IOException: No FileSystem for scheme: file

-- 
Sean