Posted to hdfs-user@hadoop.apache.org by Bo Shi <bs...@gmail.com> on 2011/07/01 04:20:08 UTC

Bizarre S3 access issue

Hey everyone, I have a bizarre issue with S3 access that I've been
struggling with for the past day or so.  I'm wondering if anyone can provide
any hints, even just on where to start investigating the problem.

I have an s3.xml configuration file with the S3N access keys defined,
deployed on two EC2 test machines (Ubuntu 10.04 and CentOS 5).  I have also
written a short demonstration class that simply accesses a file and prints
some information about it.  The s3.xml is identical on both machines, and
while the Java releases aren't _quite_ identical, both are Sun JDK 1.6.0.

On CentOS, the S3 HEAD request fails with 403 Forbidden.  As far as I can
tell there is no XML response body with further detail.
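(One common cause of a bare 403 from S3, independent of credentials, is clock skew: S3 rejects signed requests whose timestamp is more than about 15 minutes off from AWS's clock. A quick sanity check to run on each host might look like this; `ntpdate` and `pool.ntp.org` are just one option, use whatever NTP client is installed:)

```shell
# Print the local clock in UTC; compare the output across both hosts.
date -u
# Query the offset against an NTP server, if ntpdate is available.
command -v ntpdate >/dev/null \
  && ntpdate -q pool.ntp.org | tail -1 \
  || echo "(ntpdate not installed)"
```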


[root@rhhost ~]# java -version
java version "1.6.0_26"
Java(TM) SE Runtime Environment (build 1.6.0_26-b03)
Java HotSpot(TM) Server VM (build 20.1-b02, mixed mode)

[root@rhhost ~]# java c.t.b.m.l.S3Check file:///root/s3.xml s3n://my-bucket/s/2011-06-28/part-00000.bz2
Exception in thread "main" org.apache.hadoop.fs.s3.S3Exception: org.jets3t.service.S3ServiceException: S3 HEAD request failed for '/s%2F2011-06-28%2Fpart-00000.bz2' - ResponseCode=403, ResponseMessage=Forbidden
        at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.retrieveMetadata(Jets3tNativeFileSystemStore.java:122)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at org.apache.hadoop.fs.s3native.$Proxy1.retrieveMetadata(Unknown Source)
        at org.apache.hadoop.fs.s3native.NativeS3FileSystem.listStatus(NativeS3FileSystem.java:365)
        at c.t.b.m.l.S3Check.info(S3Check.java:14)
        at c.t.b.m.l.S3Check.main(S3Check.java:23)
Caused by: org.jets3t.service.S3ServiceException: S3 HEAD request failed for '/s%2F2011-06-28%2Fpart-00000.bz2' - ResponseCode=403, ResponseMessage=Forbidden
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRequest(RestS3Service.java:485)
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRestHead(RestS3Service.java:652)
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.getObjectImpl(RestS3Service.java:1556)
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.getObjectDetailsImpl(RestS3Service.java:1492)
        at org.jets3t.service.S3Service.getObjectDetails(S3Service.java:1793)
        at org.jets3t.service.S3Service.getObjectDetails(S3Service.java:1225)
        at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.retrieveMetadata(Jets3tNativeFileSystemStore.java:111)
        ... 10 more
Caused by: org.jets3t.service.impl.rest.HttpException
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRequest(RestS3Service.java:483)
        ... 16 more

[root@rhhost ~]# cat s3.xml
<?xml version="1.0" encoding="UTF-8" ?>
<configuration>
  <property>
    <name>fs.s3n.awsAccessKeyId</name>
    <value>*******</value>
  </property>
  <property>
    <name>fs.s3n.awsSecretAccessKey</name>
    <value>*******</value>
  </property>
</configuration>


But on the Ubuntu EC2 image, things work.  I have Hadoop 0.20.203.0
libraries in my CLASSPATH.
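(Another host-specific difference worth ruling out: the jars actually on each machine's classpath. A jets3t or commons-httpclient version mismatch between the two hosts would be invisible in s3.xml. A rough check, assuming the jars are referenced via $CLASSPATH:)

```shell
# List the S3-related jars each host would load; run on both and diff.
echo "$CLASSPATH" | tr ':' '\n' | grep -Ei 'jets3t|httpclient|hadoop' \
  || echo "(no matching jars on CLASSPATH)"
```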


ubuntu@debhost:~$ java -version
java version "1.6.0_24"
Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
Java HotSpot(TM) 64-Bit Server VM (build 19.1-b02, mixed mode)

ubuntu@debhost:~$ java c.t.b.m.l.S3Check file:///home/ubuntu/s3.xml s3n://my-bucket/s/2011-06-28/part-00000.bz2
len 3279457
mtime 1309318120000
repl 1


Code follows:


package c.t.b.m.l;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.IOException;

public class S3Check {
  // List the given path and print the length, modification time, and
  // replication of the first entry returned.
  public static void info(Configuration conf, Path path) throws IOException {
    FileSystem fs = FileSystem.get(path.toUri(), conf);
    FileStatus[] status = fs.listStatus(path);
    System.out.println("len " + status[0].getLen());
    System.out.println("mtime " + status[0].getModificationTime());
    System.out.println("repl " + status[0].getReplication());
  }

  // args[0]: URI of a config file defining the fs.s3n.* keys
  // args[1]: S3N path to inspect
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.addResource(new Path(args[0]));
    info(conf, new Path(args[1]));
  }
}


-- 
Bo Shi
617-942-1744