Posted to issues@flink.apache.org by zentol <gi...@git.apache.org> on 2016/02/24 11:28:44 UTC

[GitHub] flink pull request: [FLINK-3491] Prevent failure of HDFSCopyUtilit...

GitHub user zentol opened a pull request:

    https://github.com/apache/flink/pull/1703

    [FLINK-3491] Prevent failure of HDFSCopyUtilitiesTest on Windows

    This PR contains two commits that prevent this test from failing on Windows.
    
    The first commit resolves the problem raised in the JIRA: it changes how the URI is generated, using `new Path(file).toUri()` instead of `file.toUri()`, since the latter fails for Windows paths.
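    
    To illustrate (a minimal sketch, not the PR's actual diff: the class name, the sample path, and the use of Hadoop's `org.apache.hadoop.fs.Path` are assumptions, and the described behavior is what you see on a Windows machine):
    
    ```
    import java.io.File;
    import java.net.URI;
    
    import org.apache.hadoop.fs.Path;
    
    public class WindowsPathUriSketch {
        public static void main(String[] args) {
            File file = new File("C:\\tmp\\testFile"); // hypothetical Windows path
    
            // Parsing the raw absolute path as a URI fails on Windows:
            // "C:" is read as a URI scheme and "\" is illegal in URIs.
            // new URI(file.getAbsolutePath()); // throws URISyntaxException
    
            // Routing the path through Path first normalizes the separators
            // and the drive letter before the URI is built.
            URI viaPath = new Path(file.getAbsolutePath()).toUri();
            System.out.println(viaPath); // e.g. /C:/tmp/testFile
        }
    }
    ```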
    
    After this change, I got a new exception when running the test:
    ```
    testCopyFromLocal(org.apache.flink.streaming.util.HDFSCopyUtilitiesTest)  Time elapsed: 1.892 sec  <<< ERROR!
    java.lang.NullPointerException: null
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
            at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
            at org.apache.hadoop.util.Shell.run(Shell.java:418)
            at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
            at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
            at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
            at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:631)
            at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
            at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
            at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
            at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:907)
            at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:888)
            at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:785)
            at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:365)
            at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
            at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
            at org.apache.hadoop.fs.LocalFileSystem.copyFromLocalFile(LocalFileSystem.java:82)
            at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1837)
            at org.apache.flink.streaming.util.HDFSCopyFromLocal$1.run(HDFSCopyFromLocal.java:49)
    
    ```
    
    After some searching, it appears that several Hadoop versions can't deal with Windows paths unless native Windows libraries/DLLs are available (see https://issues.apache.org/jira/browse/SPARK-6961). As such, I added a second commit that disables the test when run on Windows.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/zentol/flink 3491_test_hdfscopy

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/flink/pull/1703.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1703
    
----
commit 124da4220968ce331ebd9c45cdf35e4a4074848b
Author: zentol <s....@web.de>
Date:   2016-02-24T10:14:23Z

    [FLINK-3491] Prevent URIException in HDFSCopyTest

commit c458f305ad23f9f075993133ffcf5ed5dd606eb4
Author: zentol <s....@web.de>
Date:   2016-02-24T10:14:52Z

    Disable HDFSCopyUtilitiesTest on Windows

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

[GitHub] flink pull request: [FLINK-3491] Prevent failure of HDFSCopyUtilit...

Posted by StephanEwen <gi...@git.apache.org>.
Github user StephanEwen commented on the pull request:

    https://github.com/apache/flink/pull/1703#issuecomment-189260386
  
    Looks good, +1 to merge


---

[GitHub] flink pull request: [FLINK-3491] Prevent failure of HDFSCopyUtilit...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/flink/pull/1703


---

[GitHub] flink pull request: [FLINK-3491] Prevent failure of HDFSCopyUtilit...

Posted by StephanEwen <gi...@git.apache.org>.
Github user StephanEwen commented on the pull request:

    https://github.com/apache/flink/pull/1703#issuecomment-192363616
  
    Will merge this...


---

[GitHub] flink pull request: [FLINK-3491] Prevent failure of HDFSCopyUtilit...

Posted by StephanEwen <gi...@git.apache.org>.
Github user StephanEwen commented on a diff in the pull request:

    https://github.com/apache/flink/pull/1703#discussion_r53994155
  
    --- Diff: flink-streaming-java/src/test/java/org/apache/flink/streaming/util/HDFSCopyUtilitiesTest.java ---
    @@ -35,6 +37,11 @@
     	@Rule
     	public TemporaryFolder tempFolder = new TemporaryFolder();
     
    +	@Before
    +	public void checkOperatingSystem() {
    +		Assume.assumeTrue("This test can't run successfully on Windows.", !System.getProperty("os.name").startsWith("Windows"));
    --- End diff ---
    
    You can use `OperatingSystem.isWindows()`
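    
    For reference, the check using that utility would look roughly like this (a sketch of the suggestion, assuming Flink's `org.apache.flink.util.OperatingSystem`; the class and test names are placeholders):
    
    ```
    import org.apache.flink.util.OperatingSystem;
    import org.junit.Assume;
    import org.junit.Before;
    import org.junit.Test;
    
    public class WindowsSkipSketch {
    
        @Before
        public void checkOperatingSystem() {
            // Assume.assumeTrue skips (rather than fails) the test on Windows.
            Assume.assumeTrue("This test can't run successfully on Windows.",
                    !OperatingSystem.isWindows());
        }
    
        @Test
        public void runsOnlyOffWindows() {
            // test body; runs only when the assumption above holds
        }
    }
    ```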


---

[GitHub] flink pull request: [FLINK-3491] Prevent failure of HDFSCopyUtilit...

Posted by aljoscha <gi...@git.apache.org>.
Github user aljoscha commented on the pull request:

    https://github.com/apache/flink/pull/1703#issuecomment-188204348
  
    +1, looks good to merge


---