Posted to issues@flink.apache.org by "Till Rohrmann (JIRA)" <ji...@apache.org> on 2018/10/31 13:26:00 UTC

[jira] [Updated] (FLINK-10736) Shaded Hadoop S3A end-to-end test failed on Travis

     [ https://issues.apache.org/jira/browse/FLINK-10736?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Till Rohrmann updated FLINK-10736:
----------------------------------
    Description: 
The {{Shaded Hadoop S3A end-to-end test}} failed on Travis because it could not find a file stored on S3:
{code}
org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: f28270bedd943ed6b41548b60f5cea73)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:268)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:487)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:475)
	at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
	at org.apache.flink.examples.java.wordcount.WordCount.main(WordCount.java:85)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
	at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:427)
	at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:813)
	at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:287)
	at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
	at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1050)
	at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1126)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
	at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
	at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1126)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:265)
	... 21 more
Caused by: java.io.IOException: Error opening the Input Split s3://[secure]/flink-end-to-end-test-shaded-s3a [0,44]: No such file or directory: s3://[secure]/flink-end-to-end-test-shaded-s3a
	at org.apache.flink.api.common.io.FileInputFormat.open(FileInputFormat.java:824)
	at org.apache.flink.api.common.io.DelimitedInputFormat.open(DelimitedInputFormat.java:470)
	at org.apache.flink.api.common.io.DelimitedInputFormat.open(DelimitedInputFormat.java:47)
	at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:170)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.FileNotFoundException: No such file or directory: s3://[secure]/flink-end-to-end-test-shaded-s3a
	at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:2255)
	at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:2149)
	at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:2088)
	at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.open(S3AFileSystem.java:699)
	at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.FileSystem.open(FileSystem.java:950)
	at org.apache.flink.fs.s3.common.hadoop.HadoopFileSystem.open(HadoopFileSystem.java:120)
	at org.apache.flink.fs.s3.common.hadoop.HadoopFileSystem.open(HadoopFileSystem.java:37)
	at org.apache.flink.api.common.io.FileInputFormat$InputSplitOpenThread.run(FileInputFormat.java:996)
{code}

https://api.travis-ci.org/v3/job/448770093/log.txt

A solution could be to harden this test case.
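One way to harden the test would be to wait until the uploaded input object is actually visible on S3 before submitting the job, instead of assuming the upload is immediately readable. A minimal sketch using Flink's {{FileSystem}} API; the bucket name, retry limit, and class name are placeholders for illustration, not the actual test setup:
{code}
import org.apache.flink.core.fs.FileSystem;
import org.apache.flink.core.fs.Path;

public class WaitForS3Input {
    public static void main(String[] args) throws Exception {
        // Placeholder path; the real test uses a [secure] bucket configured on Travis.
        Path input = new Path("s3://<bucket>/flink-end-to-end-test-shaded-s3a");
        FileSystem fs = input.getFileSystem();

        // Poll until the object exists, backing off between attempts,
        // so the WordCount job is only submitted once the input is readable.
        int attempts = 0;
        while (!fs.exists(input)) {
            if (++attempts > 30) {
                throw new IllegalStateException("Input never became visible: " + input);
            }
            Thread.sleep(1000L);
        }
    }
}
{code}
This assumes the shaded S3A filesystem ({{flink-s3-fs-hadoop}}) is on the classpath so that the {{s3://}} scheme resolves; the same idea could also be expressed as a retry loop in the end-to-end test script.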

  was:
The {{Shaded Hadoop S3A end-to-end test}} failed on Travis because it could not find a file stored on S3 (same stack trace as above).

https://api.travis-ci.org/v3/job/448770093/log.txt


> Shaded Hadoop S3A end-to-end test failed on Travis
> --------------------------------------------------
>
>                 Key: FLINK-10736
>                 URL: https://issues.apache.org/jira/browse/FLINK-10736
>             Project: Flink
>          Issue Type: Bug
>          Components: E2E Tests
>    Affects Versions: 1.7.0
>            Reporter: Till Rohrmann
>            Priority: Critical
>              Labels: test-stability
>             Fix For: 1.7.0
>
>
> The {{Shaded Hadoop S3A end-to-end test}} failed on Travis because it could not find a file stored on S3 (stack trace above).
> https://api.travis-ci.org/v3/job/448770093/log.txt
> A solution could be to harden this test case.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)