Posted to issues@flink.apache.org by StephanEwen <gi...@git.apache.org> on 2017/05/11 19:26:27 UTC

[GitHub] flink pull request #3876: [FLINK-6514] [build] Create a proper separate Hado...

GitHub user StephanEwen opened a pull request:

    https://github.com/apache/flink/pull/3876

    [FLINK-6514] [build] Create a proper separate Hadoop uber jar for 'flink-dist' assembly

    This fixes the issue that Flink cannot be started locally if built with Maven 3.3+
    
    This pull request contains two big fixes, because they do not build/pass the tests individually: the wrong Mesos dependencies were the reason that the broken Hadoop fat jar build still passed the YARN tests.
    
    # Hadoop Uber Jar
    
      - This builds a proper Hadoop Uber Jar with all of Hadoop's needed dependencies. The prior build was missing many important dependencies in that jar.
    
      - The Hadoop jar is no longer kept out of `flink-dist` by setting the dependency to `provided`, but via explicit exclusions (sketched below). That way, Hadoop's transitive dependencies are not excluded from other dependencies as well. Before this patch, various decompression libraries and Avro were broken in a Flink build, due to the accidental exclusion of their transitive dependencies.
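
    A minimal sketch of such an explicit exclusion on a `flink-dist` dependency (the artifact id is illustrative, not the exact entry from this PR):

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-runtime_2.10</artifactId>
            <version>${project.version}</version>
            <exclusions>
                <!-- keep the shaded Hadoop artifact out of flink-dist explicitly -->
                <exclusion>
                    <groupId>org.apache.flink</groupId>
                    <artifactId>flink-shaded-hadoop2</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

    Unlike marking `flink-shaded-hadoop2` as `provided`, this removes only the shaded Hadoop artifact itself and leaves the other transitive dependencies of each module untouched.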
    
    # Dependency fixing
    
      - This also fixes the dependencies of `flink-mesos`, which promoted all of Hadoop's transitive dependencies to its own dependencies during shading. As a result, Flink carried various unnecessary dependencies in its `flink-dist` jar (the `provided`-scope fix is sketched below).
    
      - Incidentally, that brought Hadoop's accidentally excluded dependencies back in, but into the `flink-dist` jar, not the `shaded-hadoop2` jar.
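
    As a minimal sketch of the `provided` approach used for `flink-mesos` (again with an illustrative artifact id): a `provided` dependency is visible at compile time but is not bundled by the shade plugin, and its transitive dependencies are not promoted into the shaded jar or the dependency-reduced POM:

        <!-- flink-related dependency, needed for compilation but not bundled -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-runtime_2.10</artifactId>
            <version>${project.version}</version>
            <scope>provided</scope>
        </dependency>

    Only the libraries that the module actually needs to bundle stay at the default `compile` scope.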
    
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/StephanEwen/incubator-flink fix_fat_jar

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/flink/pull/3876.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #3876
    
----
commit 86803d23dfdc56f2be64274a6a90a76c1e782f08
Author: Stephan Ewen <se...@apache.org>
Date:   2017-05-11T13:12:50Z

    [FLINK-6546] [build] Fix dependencies of flink-mesos
    
      - This makes all flink-related dependencies 'provided' so that their
        transitive dependencies are not promoted
    
      - Drops the unnecessary dependency on the Hadoop artifact
    
      - Adds directly referenced libraries, like jackson
    
      - Deactivates default logging of tests

commit a14137c109e73738a3d1f89a3d99e2fd2a799219
Author: Stephan Ewen <se...@apache.org>
Date:   2017-05-11T15:32:03Z

    [build] Reduce flink-avro's compile dependency from 'flink-java' to 'flink-core'

commit 32e8574498d7963e2ab58f1530b41a6853f23601
Author: Stephan Ewen <se...@apache.org>
Date:   2017-05-11T15:35:43Z

    [FLINK-6514] [build] Remove 'flink-shaded-hadoop2' from 'flink-dist' via exclusions
    
    This is more tedious/manual than setting it to 'provided' once, but it
    is also correct.
    
    For example, in the case of Hadoop 2.3, having 'flink-shaded-hadoop2' as 'provided'
    also removes other needed dependencies, such as 'org.codehaus.jackson', which Avro
    requires (a way to check this is sketched after the commit list).

commit efaed902a78c6a7f236e0dad4f72ed7ae8bad1c0
Author: Stephan Ewen <se...@apache.org>
Date:   2017-05-11T11:52:25Z

    [FLINK-6514] [build] Merge bin and lib assembly

commit df8efd5cdadaeef9323472f20871871c94d14af5
Author: Stephan Ewen <se...@apache.org>
Date:   2017-05-11T15:00:03Z

    [FLINK-6514] [build] Create a proper separate Hadoop uber jar for 'flink-dist' assembly

----
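
A quick, illustrative way to check for the scope leakage described in the exclusions commit above (this command is not part of the pull request) is to look at which path, and with which scope, a transitive artifact such as 'org.codehaus.jackson' resolves:

    $ mvn dependency:tree -pl flink-dist -Dincludes=org.codehaus.jackson

If the artifact shows up with 'provided' scope (or not at all), it will be missing from the packaged `flink-dist` jar at runtime.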



[GitHub] flink pull request #3876: [FLINK-6514] [build] Create a proper separate Hado...

Posted by aljoscha <gi...@git.apache.org>.
Github user aljoscha commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3876#discussion_r116157739
  
    --- Diff: flink-shaded-hadoop/flink-shaded-hadoop2-uber/pom.xml ---
    @@ -0,0 +1,129 @@
    +<?xml version="1.0" encoding="UTF-8"?>
    +<!--
    +Licensed to the Apache Software Foundation (ASF) under one
    +or more contributor license agreements.  See the NOTICE file
    +distributed with this work for additional information
    +regarding copyright ownership.  The ASF licenses this file
    +to you under the Apache License, Version 2.0 (the
    +"License"); you may not use this file except in compliance
    +with the License.  You may obtain a copy of the License at
    +
    +  http://www.apache.org/licenses/LICENSE-2.0
    +
    +Unless required by applicable law or agreed to in writing,
    +software distributed under the License is distributed on an
    +"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
    +KIND, either express or implied.  See the License for the
    +specific language governing permissions and limitations
    +under the License.
    +-->
    +
    +<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    +	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    +
    +	<modelVersion>4.0.0</modelVersion>
    +
    +	<parent>
    +		<groupId>org.apache.flink</groupId>
    +		<artifactId>flink-shaded-hadoop</artifactId>
    +		<version>1.4-SNAPSHOT</version>
    +		<relativePath>..</relativePath>
    +	</parent>
    +
    +	<artifactId>flink-shaded-hadoop2-uber</artifactId>
    +	<name>flink-shaded-hadoop2-uber</name>
    +
    +	<packaging>jar</packaging>
    +
    +	<!--
    +		the only dependency if the 'flink-shaded-hadoop2' artifact, out
    --- End diff --
    
    Typo "only dependency if ..."



[GitHub] flink pull request #3876: [FLINK-6514] [build] Create a proper separate Hado...

Posted by aljoscha <gi...@git.apache.org>.
Github user aljoscha commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3876#discussion_r116157373
  
    --- Diff: flink-dist/pom.xml ---
    @@ -403,15 +403,6 @@ under the License.
     				</configuration>
     			</plugin>
     
    -			<!-- binary compatibility checks -->
    --- End diff --
    
    This is removed because we have it in all the modules themselves?



[GitHub] flink issue #3876: [FLINK-6514] [build] Create a proper separate Hadoop uber...

Posted by rmetzger <gi...@git.apache.org>.
Github user rmetzger commented on the issue:

    https://github.com/apache/flink/pull/3876
  
    +1 to merge



[GitHub] flink pull request #3876: [FLINK-6514] [build] Create a proper separate Hado...

Posted by rmetzger <gi...@git.apache.org>.
Github user rmetzger commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3876#discussion_r116190819
  
    --- Diff: flink-dist/pom.xml ---
    @@ -403,15 +403,6 @@ under the License.
     				</configuration>
     			</plugin>
     
    -			<!-- binary compatibility checks -->
    --- End diff --
    
    I think this is an artifact from the time when the checker was defined in the root pom and we disabled it for all modules that don't need checking.



[GitHub] flink pull request #3876: [FLINK-6514] [build] Create a proper separate Hado...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/flink/pull/3876



[GitHub] flink issue #3876: [FLINK-6514] [build] Create a proper separate Hadoop uber...

Posted by rmetzger <gi...@git.apache.org>.
Github user rmetzger commented on the issue:

    https://github.com/apache/flink/pull/3876
  
    I'm merging this change now



[GitHub] flink issue #3876: [FLINK-6514] [build] Create a proper separate Hadoop uber...

Posted by rmetzger <gi...@git.apache.org>.
Github user rmetzger commented on the issue:

    https://github.com/apache/flink/pull/3876
  
    The change looks good to me.
    I'll verify it with my Cloudera VM to make sure it works with provided Hadoop versions as well.



[GitHub] flink issue #3876: [FLINK-6514] [build] Create a proper separate Hadoop uber...

Posted by rmetzger <gi...@git.apache.org>.
Github user rmetzger commented on the issue:

    https://github.com/apache/flink/pull/3876
  
    This PR works in my VM. There's just some weird behavior when starting Flink on YARN.
    I'll cross-check with AWS EMR to see if it's a YARN or a VM/Cloudera issue.
    
    I wanted to check on EMR anyway for the release :)

