Posted to common-issues@hadoop.apache.org by "Steve Loughran (JIRA)" <ji...@apache.org> on 2015/10/21 12:09:27 UTC

[jira] [Resolved] (HADOOP-12420) While trying to access Amazon S3 through hadoop-aws(Spark basically) I was getting Exception in thread "main" java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V

     [ https://issues.apache.org/jira/browse/HADOOP-12420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Steve Loughran resolved HADOOP-12420.
-------------------------------------
       Resolution: Fixed
    Fix Version/s: 2.8.0

Marking as fixed for Hadoop 2.8; the library change there fixes it. If you see this on earlier versions, you have an incompatible version of aws-java-sdk on your classpath, as Amazon changed the method's signature (the threshold parameter went from int to long), breaking binary compatibility.
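As a quick check, a reflection probe along these lines can show which overload the SDK on your classpath actually exposes. The class name is the real AWS SDK one; the probe itself is an illustrative sketch, not part of Hadoop:

import java.lang.reflect.Method;
import com.amazonaws.services.s3.transfer.TransferManagerConfiguration;

public class ProbeThresholdSignature {
    public static void main(String[] args) {
        // List every setMultipartUploadThreshold overload visible at runtime.
        // SDKs compatible with hadoop-aws 2.7.x print an int parameter;
        // later SDKs print a long parameter.
        for (Method m : TransferManagerConfiguration.class.getMethods()) {
            if (m.getName().equals("setMultipartUploadThreshold")) {
                System.out.println(m);
            }
        }
    }
}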

> While trying to access Amazon S3 through hadoop-aws(Spark basically) I was getting Exception in thread "main" java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V
> ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-12420
>                 URL: https://issues.apache.org/jira/browse/HADOOP-12420
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: fs/s3
>    Affects Versions: 2.7.1
>            Reporter: Tariq Mohammad
>            Assignee: Tariq Mohammad
>            Priority: Minor
>             Fix For: 2.8.0
>
>
> While trying to access data stored in Amazon S3 through Apache Spark, which internally uses the hadoop-aws jar, I was getting the following exception:
> Exception in thread "main" java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V
> The probable reason is that the AWS Java SDK expects a long parameter for its setMultipartUploadThreshold(long multiPartThreshold) method, while hadoop-aws was passing a parameter of type int (multiPartThreshold).
> I tried both the downloaded hadoop-aws jar and the build obtained through its Maven dependency, and in both cases I encountered the same exception. Although I can see "private long multiPartThreshold;" in the hadoop-aws GitHub repo, that change is not reflected in the downloaded jar or in the jar created from the Maven dependency.
> The following lines in the S3AFileSystem class show the difference:
> Build from trunk:
> private long multiPartThreshold;
> this.multiPartThreshold = conf.getLong("fs.s3a.multipart.threshold", 2147483647L); => Line 267
> Build through the Maven dependency:
> private int multiPartThreshold;
> multiPartThreshold = conf.getInt(MIN_MULTIPART_THRESHOLD, DEFAULT_MIN_MULTIPART_THRESHOLD); => Line 249
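For illustration, here is a minimal, self-contained sketch of why this surfaces as a linkage error rather than a compile error: the caller's bytecode records the exact descriptor (I)V, and the JVM does not widen int to long when resolving a method at link time. The class below is a hypothetical stand-in, not the real AWS SDK class:

// Hypothetical stand-in for the SDK class; the real one is
// com.amazonaws.services.s3.transfer.TransferManagerConfiguration.
class TransferConfig {
    // Old SDK releases declared the int overload; callers compiled against
    // them embed the descriptor setMultipartUploadThreshold(I)V.
    public void setMultipartUploadThreshold(int threshold) { }

    // Newer releases replaced it with a long overload, descriptor (J)V.
    // A caller linked against the old int version then fails at runtime with
    // NoSuchMethodError, even though recompiling the same source against the
    // new class would succeed (the compiler widens the int argument to long).
    // public void setMultipartUploadThreshold(long threshold) { }
}

public class SignatureMismatchDemo {
    public static void main(String[] args) {
        // Compiled against the class above, this call is recorded as (I)V.
        // Swap in a jar where only the long overload exists, and this line
        // throws java.lang.NoSuchMethodError when it is first executed.
        new TransferConfig().setMultipartUploadThreshold(5 * 1024 * 1024);
    }
}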



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)