Posted to common-issues@hadoop.apache.org by "Steve Loughran (Jira)" <ji...@apache.org> on 2019/10/08 09:30:00 UTC

[jira] [Commented] (HADOOP-16642) ITestDynamoDBMetadataStoreScale failing as the error text does not match expectations

    [ https://issues.apache.org/jira/browse/HADOOP-16642?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16946658#comment-16946658 ] 

Steve Loughran commented on HADOOP-16642:
-----------------------------------------

{code}
13:06:22 java.lang.AssertionError: 
13:06:22 Expected throttling message:  Expected to find ' This may be because the write threshold of DynamoDB is set too low.' 
but got unexpected exception: org.apache.hadoop.fs.s3a.AWSServiceThrottledException: 

Put tombstone on s3a://fake-bucket/moved-here: com.amazonaws.services.dynamodbv2.model.ProvisionedThroughputExceededException: 
The level of configured provisioned throughput for the table was exceeded. 
Consider increasing your provisioning level with the UpdateTable API. 
(Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ProvisionedThroughputExceededException; 
Request ID: L12H9UM7PE8K0ILPGGTF4QG367VV4KQNSO5AEMVJF66Q9ASUAAJG): 
The level of configured provisioned throughput for the table was exceeded. 
Consider increasing your provisioning level with the UpdateTable API. 
(Service: AmazonDynamoDBv2; Status Code: 400; 
Error Code: ProvisionedThroughputExceededException; Request ID: L12H9UM7PE8K0ILPGGTF4QG367VV4KQNSO5AEMVJF66Q9ASUAAJG)
13:06:22 	at org.apache.hadoop.fs.s3a.S3AUtils.translateDynamoDBException(S3AUtils.java:402)
13:06:22 	at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:193)
13:06:22 	at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:111)
13:06:22 	at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$3(Invoker.java:265)
13:06:22 	at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:322)
13:06:22 	at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:261)
13:06:22 	at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:236)
13:06:22 	at org.apache.hadoop.fs.s3a.s3guard.DynamoDBMetadataStore.innerDelete(DynamoDBMetadataStore.java:490)
13:06:22 	at org.apache.hadoop.fs.s3a.s3guard.DynamoDBMetadataStore.deleteSubtree(DynamoDBMetadataStore.java:520)
13:06:22 	at org.apache.hadoop.fs.s3a.scale.AbstractITestS3AMetadataStoreScale.clearMetadataStore(AbstractITestS3AMetadataStoreScale.java:196)
13:06:22 	at org.apache.hadoop.fs.s3a.scale.AbstractITestS3AMetadataStoreScale.test_020_Moves(AbstractITestS3AMetadataStoreScale.java:138)
13:06:22 	at org.apache.hadoop.fs.s3a.s3guard.ITestDynamoDBMetadataStoreScale.test_020_Moves(ITestDynamoDBMetadataStoreScale.java:184)
13:06:22 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
13:06:22 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
13:06:22 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
13:06:22 	at java.lang.reflect.Method.invoke(Method.java:498)
13:06:22 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
13:06:22 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
13:06:22 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
13:06:22 	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
13:06:22 	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
13:06:22 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
13:06:22 	at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
13:06:22 	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
13:06:22 Caused by: com.amazonaws.services.dynamodbv2.model.ProvisionedThroughputExceededException: The level of configured provisioned throughput for the table was exceeded. Consider increasing your provisioning level with the UpdateTable API. (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ProvisionedThroughputExceededException; Request ID: L12H9UM7PE8K0ILPGGTF4QG367VV4KQNSO5AEMVJF66Q9ASUAAJG)
13:06:22 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1640)
13:06:22 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1304)
13:06:22 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1058)
13:06:22 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:743)
13:06:22 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
13:06:22 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
13:06:22 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
13:06:22 	at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
13:06:22 	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
13:06:22 	at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.doInvoke(AmazonDynamoDBClient.java:3443)
13:06:22 	at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:3419)
13:06:22 	at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.executePutItem(AmazonDynamoDBClient.java:2194)
13:06:22 	at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.putItem(AmazonDynamoDBClient.java:2169)
13:06:22 	at com.amazonaws.services.dynamodbv2.document.internal.PutItemImpl.doPutItem(PutItemImpl.java:85)
13:06:22 	at com.amazonaws.services.dynamodbv2.document.internal.PutItemImpl.putItem(PutItemImpl.java:41)
13:06:22 	at com.amazonaws.services.dynamodbv2.document.Table.putItem(Table.java:151)
13:06:22 	at org.apache.hadoop.fs.s3a.s3guard.DynamoDBMetadataStore.lambda$innerDelete$0(DynamoDBMetadataStore.java:494)
13:06:22 	at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:109)
13:06:22 	... 21 more
13:06:22 
{code}
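A minimal, self-contained sketch of the proposed fix (assumed shape, not the actual Hadoop code): a generic intercept-style helper resembling Hadoop's {{LambdaTestUtils.intercept}} that asserts only on the exception *type* and deliberately skips any message-text matching, which is what "don't look for any text" amounts to.

```java
import java.util.concurrent.Callable;

// Hypothetical sketch: verify that an action throws the expected exception
// type without asserting on the error text, since the text returned by the
// service (here, DynamoDB throttling) can change and break the test.
public class InterceptSketch {

    static <E extends Exception> E intercept(
            Class<E> clazz, Callable<?> action) throws Exception {
        try {
            action.call();
        } catch (Exception e) {
            if (clazz.isInstance(e)) {
                // Expected type raised: pass. No message matching here.
                return clazz.cast(e);
            }
            throw e; // unexpected exception type: propagate as a failure
        }
        throw new AssertionError(
            "Expected " + clazz.getName() + " but no exception was thrown");
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the throttled metastore call in the real test.
        Exception caught = intercept(IllegalStateException.class, () -> {
            throw new IllegalStateException(
                "The level of configured provisioned throughput "
                + "for the table was exceeded.");
        });
        System.out.println(caught.getClass().getSimpleName());
    }
}
```

The real test would intercept {{AWSServiceThrottledException}} around the metastore operation; the point of the sketch is only that the type check alone is enough, so varying throttling messages can no longer fail the assertion.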

> ITestDynamoDBMetadataStoreScale failing as the error text does not match expectations
> -------------------------------------------------------------------------------------
>
>                 Key: HADOOP-16642
>                 URL: https://issues.apache.org/jira/browse/HADOOP-16642
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3, test
>    Affects Versions: 3.3.0
>            Reporter: Steve Loughran
>            Priority: Major
>
> ITestDynamoDBMetadataStoreScale only attempts the scale test iff the table isn't PAYG (on-demand). It's failing because the error text returned does not match what the assertion expects.
> Proposed: don't look for any text
> {code} 
> 13:06:22 java.lang.AssertionError: 
> 13:06:22 Expected throttling message:  Expected to find ' This may be because the write threshold of DynamoDB is set too low.' 
> but got unexpected exception: org.apache.hadoop.fs.s3a.AWSServiceThrottledException: 
> Put tombstone on s3a://fake-bucket/moved-here: com.amazonaws.services.dynamodbv2.model.ProvisionedThroughputExceededException: 
> The level of configured provisioned throughput for the table was exceeded. 
> Consider increasing your provisioning level with the UpdateTable API. 
> (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ProvisionedThroughputExceededException; 
> Request ID: L12H9UM7PE8K0ILPGGTF4QG367VV4KQNSO5AEMVJF66Q9ASUAAJG): 
> The level of configured provisioned throughput for the table was exceeded. 
> Consider increasing your provisioning level with the UpdateTable API. 
> (Service: AmazonDynamoDBv2; Status Code: 400; 
> Error Code: ProvisionedThroughputExceededException; Request ID: L12H9UM7PE8K0ILPGGTF4QG367VV4KQNSO5AEMVJF66Q9ASUAAJG)
> 13:06:22 	at org.apache.hadoop.fs.s3a.S3AUtils.translateDynamoDBException(S3AUtils.java:402)
> 13
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org