Posted to user@hbase.apache.org by Akmal Abbasov <ak...@icloud.com> on 2015/03/17 12:38:31 UTC

java.io.FileNotFoundException in exportSnapshot

Hi, I have 2 clusters running HBase, and I want to export a snapshot from cluster A to cluster B.
When I run ExportSnapshot I get a java.io.FileNotFoundException, because it is looking for a jar file in HDFS instead of on my local storage.
Any ideas on how this could be solved?

Here is the output:
2015-03-17 11:30:26,310 INFO  [main] Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2015-03-17 11:30:26,312 INFO  [main] jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
2015-03-17 11:30:27,383 INFO  [main] mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-hadoop/mapred/staging/hadoop806729561/.staging/job_local806729561_0001
2015-03-17 11:30:27,387 ERROR [main] snapshot.ExportSnapshot: Snapshot export failed
java.io.FileNotFoundException: File does not exist: hdfs://hiveprodeuw1/opt/hadoop/hbase-0.98.7-hadoop2/lib/hadoop-common-2.5.1.jar
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1072)
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
	at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
	at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
	at org.apache.hadoop.hbase.snapshot.ExportSnapshot.runCopyJob(ExportSnapshot.java:768)
	at org.apache.hadoop.hbase.snapshot.ExportSnapshot.run(ExportSnapshot.java:925)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.hbase.snapshot.ExportSnapshot.innerMain(ExportSnapshot.java:991)
	at org.apache.hadoop.hbase.snapshot.ExportSnapshot.main(ExportSnapshot.java:995)
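
A note on the trace above, hedged since only this log is available: the missing path is a jar from the local HBase lib directory that the MapReduce job submitter is resolving against the cluster's default filesystem (HDFS) rather than the local disk. Assuming the paths from the trace and a working HDFS client on the same machine, the mismatch can be confirmed with:

  ls -l /opt/hadoop/hbase-0.98.7-hadoop2/lib/hadoop-common-2.5.1.jar          # the jar exists locally
  hdfs dfs -ls /opt/hadoop/hbase-0.98.7-hadoop2/lib/hadoop-common-2.5.1.jar   # but not on HDFS, hence the FileNotFoundException
  hdfs getconf -confKey fs.defaultFS                                          # filesystem that unqualified paths resolve against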

Re: java.io.FileNotFoundException in exportSnapshot

Posted by Akmal Abbasov <ak...@icloud.com>.
Hi Ted,
I am using hbase-0.98.7, hadoop-2.5.1.
The command I am using is:
./hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot -snapshot mysnap -copy-to hdfs://backupnode/hbase
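
For reference, a fully qualified invocation usually looks like the sketch below; the namenode port and the mapper count are placeholders, not values taken from this thread:

  ./hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot \
      -snapshot mysnap \
      -copy-to hdfs://backupnode:8020/hbase \
      -mappers 16

ExportSnapshot copies the snapshot manifest and hfiles under the directory given to -copy-to, so that path is normally the destination cluster's hbase.rootdir.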

> On 17 Mar 2015, at 14:04, Ted Yu <yu...@gmail.com> wrote:
> 
> Can you show us the command you used?
> 
> What HBase release are you using?
> 
> Thanks


Re: java.io.FileNotFoundException in exportSnapshot

Posted by Ted Yu <yu...@gmail.com>.
Can you show us the command you used?

What HBase release are you using?

Thanks



> On Mar 17, 2015, at 4:38 AM, Akmal Abbasov <ak...@icloud.com> wrote:
> 
> Hi, I have 2 clusters running HBase, and I want to export a snapshot from cluster A to cluster B.
> When I run ExportSnapshot I get a java.io.FileNotFoundException, because it is looking for a jar file in HDFS instead of on my local storage.
> Any ideas on how this could be solved?

Re: java.io.FileNotFoundException in exportSnapshot

Posted by Akmal Abbasov <ak...@icloud.com>.
Is this a bug?
I've tried the same configuration files with hadoop 2.6.0 and hbase 1.0.0, and it works. But when I use the same configuration files and do the same thing with hadoop 2.5.1 and hbase 0.98.7-hadoop2, I get the errors below:

hbase-0.98.7-hadoop2/bin/hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot -snapshot mysnap -copy-to hdfs://198.58.88.31:9000/hbase/test
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/vagrant/hbase-0.98.7-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/vagrant/hadoop-2.5.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2015-03-17 16:38:08,815 INFO  [main] snapshot.ExportSnapshot: Copy Snapshot Manifest
2015-03-17 16:38:09,235 INFO  [main] Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2015-03-17 16:38:09,236 INFO  [main] jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
2015-03-17 16:38:09,334 INFO  [main] mapreduce.JobSubmitter: Cleaning up the staging area file:/home/vagrant/hadoop-2.5.1/hadoop-datastore/mapred/staging/vagrant1812768224/.staging/job_local1812768224_0001
2015-03-17 16:38:09,335 ERROR [main] snapshot.ExportSnapshot: Snapshot export failed
java.io.FileNotFoundException: File does not exist: hdfs://namenode:9000/home/vagrant/hbase-0.98.7-hadoop2/lib/hadoop-mapreduce-client-core-2.5.1.jar
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1072)
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
	at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
	at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
	at org.apache.hadoop.hbase.snapshot.ExportSnapshot.runCopyJob(ExportSnapshot.java:768)
	at org.apache.hadoop.hbase.snapshot.ExportSnapshot.run(ExportSnapshot.java:925)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.hbase.snapshot.ExportSnapshot.innerMain(ExportSnapshot.java:991)
	at org.apache.hadoop.hbase.snapshot.ExportSnapshot.main(ExportSnapshot.java:995)

Any ideas?
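
A hedged observation rather than a confirmed fix: the staging path in the log (file:/home/vagrant/...) suggests the job is being submitted through the LocalJobRunner while fs.defaultFS points at HDFS, so the local HBase lib jars end up being looked up on HDFS, where they don't exist. Two things that may be worth trying; the Hadoop config directory below is a guess based on the paths in the trace, and ExportSnapshot accepts the generic -D option because it runs through ToolRunner:

  # make the Hadoop client config (core-site.xml, mapred-site.xml, yarn-site.xml) visible to the hbase script
  export HBASE_CLASSPATH=/home/vagrant/hadoop-2.5.1/etc/hadoop

  # and/or state the MapReduce framework explicitly when submitting the export job
  hbase-0.98.7-hadoop2/bin/hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot \
      -D mapreduce.framework.name=yarn \
      -snapshot mysnap -copy-to hdfs://198.58.88.31:9000/hbase/test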
 
> On 17 Mar 2015, at 12:38, Akmal Abbasov <ak...@icloud.com> wrote:
> 
> Hi, I have 2 clusters running HBase, and I want to export a snapshot from cluster A to cluster B.
> When I run ExportSnapshot I get a java.io.FileNotFoundException, because it is looking for a jar file in HDFS instead of on my local storage.
> Any ideas on how this could be solved?