Posted to common-issues@hadoop.apache.org by "Steve Loughran (JIRA)" <ji...@apache.org> on 2017/09/21 14:01:00 UTC

[jira] [Created] (HADOOP-14890) Move up to AWS SDK 1.11.199

Steve Loughran created HADOOP-14890:
---------------------------------------

             Summary: Move up to AWS SDK 1.11.199
                 Key: HADOOP-14890
                 URL: https://issues.apache.org/jira/browse/HADOOP-14890
             Project: Hadoop Common
          Issue Type: Sub-task
          Components: build, fs/s3
    Affects Versions: 3.0.0-beta1
            Reporter: Steve Loughran
            Assignee: Steve Loughran


The AWS SDK in Hadoop 3.0.0-beta1 prints a warning whenever you call abort() on a stream, which is what we need to do whenever doing a long-distance seek in a large file opened with fadvise=normal.

{code}
2017-09-20 17:51:50,459 [ScalaTest-main-running-S3ASeekReadSuite] INFO  s3.S3ASeekReadSuite (Logging.scala:logInfo(54)) - 
2017-09-20 17:51:50,460 [ScalaTest-main-running-S3ASeekReadSuite] INFO  s3.S3ASeekReadSuite (Logging.scala:logInfo(54)) - Starting read() [pos = 45603305]
2017-09-20 17:51:50,461 [ScalaTest-main-running-S3ASeekReadSuite] WARN  internal.S3AbortableInputStream (S3AbortableInputStream.java:close(163)) - Not all bytes were read from the S3ObjectInputStream, aborting HTTP connection. This is likely an error and may result in sub-optimal behavior. Request only the bytes you need via a ranged GET or drain the input stream after use.
2017-09-20 17:51:51,263 [ScalaTest-main-running-S3ASeekReadSuite] INFO  s3.S3ASeekReadSuite (Logging.scala:logInfo(54)) - Duration of read() [pos = 45603305] = 803,650,637 nS
{code}

This goes away if we upgrade to the latest SDK, at least for the non-localdynamo bits.
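For context, the warning in the log above comes from the SDK's S3AbortableInputStream, which checks at close() time whether all bytes of the GET were consumed. A minimal self-contained sketch of that kind of check follows; the class, field, and method names here are illustrative only, not the SDK's, and the real stream tears down an HTTP connection rather than setting a flag:

```java
import java.io.ByteArrayInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

/**
 * Illustrative sketch (not the AWS SDK's actual class) of the
 * unread-bytes check behind the "Not all bytes were read from the
 * S3ObjectInputStream" warning: closing a stream with bytes still
 * pending is flagged, unless the caller explicitly aborted first.
 */
class AbortableStreamSketch extends FilterInputStream {
    private final long contentLength;
    private long bytesRead;
    private boolean aborted;
    private boolean warned;   // records whether close() would have warned

    AbortableStreamSketch(InputStream in, long contentLength) {
        super(in);
        this.contentLength = contentLength;
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b >= 0) {
            bytesRead++;
        }
        return b;
    }

    /** Drop the connection without draining the remaining bytes. */
    public void abort() {
        aborted = true;
    }

    public boolean warnedOnClose() {
        return warned;
    }

    @Override
    public void close() throws IOException {
        // The real SDK logs the WARN message here; this sketch just
        // remembers that the condition fired.
        if (!aborted && bytesRead < contentLength) {
            warned = true;
        }
        super.close();
    }
}
```

Closing after a partial read trips the check, while an explicit abort() suppresses it; the behaviour this issue tracks is that the 1.11.x SDK in beta1 warned even on the abort() path, which is exactly what S3A relies on for long backwards seeks under fadvise=normal.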



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
