Posted to user@hbase.apache.org by Vasco Pinho <va...@hotjar.com> on 2017/04/05 16:19:28 UTC

Export snapshot to S3

Hello everyone, after setting up an HBase cluster (HBase 1.2.4, Hadoop
2.7.3, Java 8), trying to export a snapshot to S3 is giving me an error:

bin/hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot -snapshot 175804042016-snap -copy-from hdfs://master1.domain.com:8020/hbase -copy-to s3a://hbase-bucket/175804042016-snap -overwrite
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/root/.m2/repository/org/slf4j/slf4j-log4j12/1.7.7/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/ubuntu/hbase-1.2.4/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2017-04-05 16:02:05,347 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-04-05 16:02:06,837 INFO  [main] snapshot.ExportSnapshot: Copy Snapshot Manifest
Exception in thread "main" java.lang.NullPointerException
        at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:268)
        at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
        at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.createTmpFileForWrite(LocalDirAllocator.java:416)
        at org.apache.hadoop.fs.LocalDirAllocator.createTmpFileForWrite(LocalDirAllocator.java:198)
        at org.apache.hadoop.fs.s3a.S3AOutputStream.<init>(S3AOutputStream.java:87)
        at org.apache.hadoop.fs.s3a.S3AFileSystem.create(S3AFileSystem.java:410)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:906)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:887)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:784)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:365)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:356)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
        at org.apache.hadoop.hbase.snapshot.ExportSnapshot.run(ExportSnapshot.java:996)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.hbase.snapshot.ExportSnapshot.innerMain(ExportSnapshot.java:1102)
        at org.apache.hadoop.hbase.snapshot.ExportSnapshot.main(ExportSnapshot.java:1106)


My temp dir is the default one and I can check that it is indeed created
under /tmp/hbase-username. It is empty except for this path, though (where
root is the username): /tmp/hbase-root/local/jars/
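
To illustrate, the layout under the temp dir is roughly this (reconstructed
from what I described above, not a verbatim paste):

find /tmp/hbase-root -type d
/tmp/hbase-root
/tmp/hbase-root/local
/tmp/hbase-root/local/jars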

In S3, in said bucket, this path gets created, although it is otherwise
empty:
hbase-bucket/175804042016-snap/.hbase-snapshot/.tmp/175804042016-snap

To make sure my setup was otherwise OK, I attached another EBS disk,
mounted it on a different path, changed the tmp dir config to point there,
and retried. Same error. To further rule out mistakes on my side, I tried
an export to a file:// destination instead, and that worked without any
problems.
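
For reference, the file:// test was along these lines (the local path below
is illustrative, not the exact mount point I used):

bin/hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot -snapshot 175804042016-snap -copy-from hdfs://master1.domain.com:8020/hbase -copy-to file:///mnt/ebs/snapshot-export -overwrite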

Two questions then:
1) What might be causing the error when exporting to S3 (permissions
obviously work, since things are getting created there)? It looks like it
can't create a folder locally, but when exporting to file:// everything
goes smoothly with the same user, etc.
2) If I don't get this working, is there anything that wouldn't work with
an export to file:// and a subsequent "manual" upload to S3 with awscli
(see the sketch right after this list)? From my understanding, snapshots
never get very large, since they are basically pointers to the underlying
files, so size shouldn't be a problem.
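
Concretely, for 2) I have in mind something like the following on top of
the file:// export above (bucket and prefix illustrative):

aws s3 sync /mnt/ebs/snapshot-export s3://hbase-bucket/175804042016-snap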


PS: This is my first post to this list, so if something breaks the rules,
please let me know.

-- 
Vasco Pinho
DevOps Engineer
www.hotjar.com
Connect with me on LinkedIn
<https://pt.linkedin.com/in/vasco-pinho-58770534>