Posted to issues@flink.apache.org by zentol <gi...@git.apache.org> on 2018/05/30 11:19:30 UTC

[GitHub] flink pull request #6102: [FLINK-9091][build] Dependency convergence run aga...

GitHub user zentol opened a pull request:

    https://github.com/apache/flink/pull/6102

    [FLINK-9091][build] Dependency convergence run against dependency-red…

    ## What is the purpose of the change
    
    As outlined in the JIRA, the enforcer plugin does not always run against dependency-reduced poms, which leads to convergence failures. This heavily conflicts with our approach of shading dependencies.
    
    This PR ensures that the dependency-convergence check, at least on Travis, is run against dependency-reduced poms.
    
    The easiest way to achieve this is to run the enforcer plugin separately for each leaf module; in that case Maven resolves dependencies through the local repository, and thus against dependency-reduced poms.
    Naturally this cannot be nicely modeled in the Maven life-cycle, hence we rely on a bash script that iterates over all modules.
    
    To prevent other devs from running into this problem, the enforcer plugin is now disabled by default and must be explicitly enabled with a profile. To check convergence locally, one may use `tools/check_dependency_convergence.sh`.
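    
    For reference, a minimal sketch of the two ways to run the check locally (the `check-convergence` property name is taken from the script added by this PR; treat the exact invocations as a sketch, not an authoritative reference):
    
        # run the check against dependency-reduced poms via the helper script
        tools/check_dependency_convergence.sh
        
        # or activate the enforcer profile directly; note that in a regular
        # reactor build this resolves against the original poms and may
        # therefore report exactly the false positives described above
        mvn validate -Dcheck-convergence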
    
    This PR subsumes #6073.
    
    ## Brief change log
    
    * disable enforcer by default
    * add profile to enable enforcer
    * add `tools/check_dependency_convergence.sh` script to run convergence check against dependency-reduced poms
    * modify travis scripts to run convergence check after compilation
    
    ## Verifying this change
    
    * add a dependency-convergence violation and run on Travis; optionally run the script locally


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/zentol/flink 9091

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/flink/pull/6102.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #6102
    
----
commit 6f2eb68d785b88de36d7eb6e0d48d487e8a7a811
Author: zentol <ch...@...>
Date:   2018-05-28T13:10:53Z

    [FLINK-9091][build] Dependency convergence run against dependency-reduced poms

----


---

[GitHub] flink pull request #6102: [FLINK-9091][build] Dependency convergence run aga...

Posted by twalthr <gi...@git.apache.org>.
Github user twalthr commented on a diff in the pull request:

    https://github.com/apache/flink/pull/6102#discussion_r199135922
  
    --- Diff: tools/check_dependency_convergence.sh ---
    @@ -0,0 +1,67 @@
    +#!/usr/bin/env bash
    +################################################################################
    +#  Licensed to the Apache Software Foundation (ASF) under one
    +#  or more contributor license agreements.  See the NOTICE file
    +#  distributed with this work for additional information
    +#  regarding copyright ownership.  The ASF licenses this file
    +#  to you under the Apache License, Version 2.0 (the
    +#  "License"); you may not use this file except in compliance
    +#  with the License.  You may obtain a copy of the License at
    +#
    +#      http://www.apache.org/licenses/LICENSE-2.0
    +#
    +#  Unless required by applicable law or agreed to in writing, software
    +#  distributed under the License is distributed on an "AS IS" BASIS,
    +#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +#  See the License for the specific language governing permissions and
    +# limitations under the License.
    +################################################################################
    +
    +HERE="`dirname \"$0\"`"				# relative
    +HERE="`( cd \"$HERE\" && pwd )`" 	# absolutized and normalized
    +if [ -z "$HERE" ] ; then
    +	# error; for some reason, the path is not accessible
    +	# to the script (e.g. permissions re-evaled after suid)
    +	exit 1  # fail
    +fi
    +
    +FLINK_DIR="${HERE}"
    +
    +if [[ $(basename ${HERE}) == "tools" ]] ; then
    +  FLINK_DIR="${HERE}/.."
    +fi
    +
    +FLINK_DIR="`( cd \"${FLINK_DIR}\" && pwd )`" 
    +
    +echo ${FLINK_DIR}
    +
    +# get list of all flink modules
    +# searches for directories containing a pom.xml file
    +# sorts the list alphabetically
    +# only accepts directories starting with "flink" to filter force-shading
    +modules=$(find -maxdepth 3 -name 'pom.xml' -printf '%h\n' | sort -u | grep "flink")
    +
    +for module in ${modules}
    +do
    +    # we are only interested in child modules
    +    for other_module in ${modules}
    +    do 
    +        if [[ "${other_module}" != "${module}" && "${other_module}" = "${module}"/* ]]; then
    +            echo "excluding ${module} since it is not a leaf module"
    +            continue 2
    +        fi
    +    done
    +    
    +    cd "${module}"
    +    echo "checking ${module}"
    +    output=$(mvn validate -nsu -Dcheckstyle.skip=true -Dcheck-convergence)
    --- End diff --
    
    Do we need to skip `scalastyle` and the `rat` plugin to speed things up here?


---

[GitHub] flink pull request #6102: [FLINK-9091][build] Dependency convergence run aga...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/flink/pull/6102


---

[GitHub] flink pull request #6102: [FLINK-9091][build] Dependency convergence run aga...

Posted by zentol <gi...@git.apache.org>.
Github user zentol commented on a diff in the pull request:

    https://github.com/apache/flink/pull/6102#discussion_r199776329
  
    --- Diff: tools/check_dependency_convergence.sh ---
    @@ -0,0 +1,67 @@
    +#!/usr/bin/env bash
    +################################################################################
    +#  Licensed to the Apache Software Foundation (ASF) under one
    +#  or more contributor license agreements.  See the NOTICE file
    +#  distributed with this work for additional information
    +#  regarding copyright ownership.  The ASF licenses this file
    +#  to you under the Apache License, Version 2.0 (the
    +#  "License"); you may not use this file except in compliance
    +#  with the License.  You may obtain a copy of the License at
    +#
    +#      http://www.apache.org/licenses/LICENSE-2.0
    +#
    +#  Unless required by applicable law or agreed to in writing, software
    +#  distributed under the License is distributed on an "AS IS" BASIS,
    +#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +#  See the License for the specific language governing permissions and
    +# limitations under the License.
    +################################################################################
    +
    +HERE="`dirname \"$0\"`"				# relative
    +HERE="`( cd \"$HERE\" && pwd )`" 	# absolutized and normalized
    +if [ -z "$HERE" ] ; then
    +	# error; for some reason, the path is not accessible
    +	# to the script (e.g. permissions re-evaled after suid)
    +	exit 1  # fail
    +fi
    +
    +FLINK_DIR="${HERE}"
    +
    +if [[ $(basename ${HERE}) == "tools" ]] ; then
    +  FLINK_DIR="${HERE}/.."
    +fi
    +
    +FLINK_DIR="`( cd \"${FLINK_DIR}\" && pwd )`" 
    +
    +echo ${FLINK_DIR}
    +
    +# get list of all flink modules
    +# searches for directories containing a pom.xml file
    +# sorts the list alphabetically
    +# only accepts directories starting with "flink" to filter force-shading
    +modules=$(find -maxdepth 3 -name 'pom.xml' -printf '%h\n' | sort -u | grep "flink")
    +
    +for module in ${modules}
    +do
    +    # we are only interested in child modules
    +    for other_module in ${modules}
    +    do 
    +        if [[ "${other_module}" != "${module}" && "${other_module}" = "${module}"/* ]]; then
    +            echo "excluding ${module} since it is not a leaf module"
    +            continue 2
    +        fi
    +    done
    +    
    +    cd "${module}"
    +    echo "checking ${module}"
    +    output=$(mvn validate -nsu -Dcheckstyle.skip=true -Dcheck-convergence)
    --- End diff --
    
    Neither plugin is executed in the `validate` phase.


---

[GitHub] flink pull request #6102: [FLINK-9091][build] Dependency convergence run aga...

Posted by zentol <gi...@git.apache.org>.
Github user zentol commented on a diff in the pull request:

    https://github.com/apache/flink/pull/6102#discussion_r199776816
  
    --- Diff: tools/check_dependency_convergence.sh ---
    @@ -0,0 +1,67 @@
    +#!/usr/bin/env bash
    +################################################################################
    +#  Licensed to the Apache Software Foundation (ASF) under one
    +#  or more contributor license agreements.  See the NOTICE file
    +#  distributed with this work for additional information
    +#  regarding copyright ownership.  The ASF licenses this file
    +#  to you under the Apache License, Version 2.0 (the
    +#  "License"); you may not use this file except in compliance
    +#  with the License.  You may obtain a copy of the License at
    +#
    +#      http://www.apache.org/licenses/LICENSE-2.0
    +#
    +#  Unless required by applicable law or agreed to in writing, software
    +#  distributed under the License is distributed on an "AS IS" BASIS,
    +#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +#  See the License for the specific language governing permissions and
    +# limitations under the License.
    +################################################################################
    +
    +HERE="`dirname \"$0\"`"				# relative
    +HERE="`( cd \"$HERE\" && pwd )`" 	# absolutized and normalized
    +if [ -z "$HERE" ] ; then
    +	# error; for some reason, the path is not accessible
    +	# to the script (e.g. permissions re-evaled after suid)
    +	exit 1  # fail
    +fi
    +
    +FLINK_DIR="${HERE}"
    +
    +if [[ $(basename ${HERE}) == "tools" ]] ; then
    +  FLINK_DIR="${HERE}/.."
    +fi
    +
    +FLINK_DIR="`( cd \"${FLINK_DIR}\" && pwd )`" 
    +
    +echo ${FLINK_DIR}
    +
    +# get list of all flink modules
    +# searches for directories containing a pom.xml file
    +# sorts the list alphabetically
    +# only accepts directories starting with "flink" to filter force-shading
    +modules=$(find -maxdepth 3 -name 'pom.xml' -printf '%h\n' | sort -u | grep "flink")
    +
    +for module in ${modules}
    +do
    +    # we are only interested in child modules
    +    for other_module in ${modules}
    +    do 
    +        if [[ "${other_module}" != "${module}" && "${other_module}" = "${module}"/* ]]; then
    +            echo "excluding ${module} since it is not a leaf module"
    +            continue 2
    +        fi
    +    done
    +    
    +    cd "${module}"
    +    echo "checking ${module}"
    +    output=$(mvn validate -nsu -Dcheckstyle.skip=true -Dcheck-convergence)
    --- End diff --
    
    We explicitly run `rat` in the `verify` phase, which is also the phase `scalastyle` runs in by default.
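    
    For completeness: if either plugin did run in an earlier phase, both could still be skipped explicitly via their standard skip properties; a sketch (these flags come from the respective plugins, not from this PR):
    
        mvn validate -nsu -Dcheckstyle.skip=true -Dscalastyle.skip=true \
            -Drat.skip=true -Dcheck-convergence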


---

[GitHub] flink pull request #6102: [FLINK-9091][build] Dependency convergence run aga...

Posted by twalthr <gi...@git.apache.org>.
Github user twalthr commented on a diff in the pull request:

    https://github.com/apache/flink/pull/6102#discussion_r199123464
  
    --- Diff: tools/travis_mvn_watchdog.sh ---
    @@ -533,6 +537,12 @@ esac
     # Run tests if compilation was successful
     if [ $EXIT_CODE == 0 ]; then
     
    +    # Start watching $MVN_OUT
    +    watchdog &
    --- End diff --
    
    By the way: should we automate the lists for `MODULES_CONNECTORS` and `MODULES_LIBRARIES`? They are not up-to-date; e.g. `flink-json`, `flink-sql-client`, and `flink-orc` are missing.


---

[GitHub] flink issue #6102: [FLINK-9091][build] Dependency convergence run against de...

Posted by twalthr <gi...@git.apache.org>.
Github user twalthr commented on the issue:

    https://github.com/apache/flink/pull/6102
  
    Thanks for the update @zentol. So do we want to support this script on macOS as well, or not? Otherwise it is good to merge. What is your opinion on the static `MODULES_CONNECTORS` and `MODULES_LIBRARIES`? Should we create a separate issue for them?


---

[GitHub] flink issue #6102: [FLINK-9091][build] Dependency convergence run against de...

Posted by zentol <gi...@git.apache.org>.
Github user zentol commented on the issue:

    https://github.com/apache/flink/pull/6102
  
    I'll merge it as is and create a follow-up for making the `check-convergence` script run on macOS.


---

[GitHub] flink issue #6102: [FLINK-9091][build] Dependency convergence run against de...

Posted by zentol <gi...@git.apache.org>.
Github user zentol commented on the issue:

    https://github.com/apache/flink/pull/6102
  
    We should make sure it runs on macOS; I just haven't had the time to search for a cross-platform solution.
    I also wouldn't be able to verify that it actually runs on macOS, so someone else may have to do that.
    
    I'm rather hesitant about generating the lists automatically. We could only generate part of them anyway (since contrib is split among them), and partially generating them is messy. What I would prefer is failing the build if a library/connector doesn't show up in the list, to create more awareness for the whole thing.
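    
    A sketch of what such a check could look like (hypothetical; it assumes `MODULES_CONNECTORS` is a space-separated list in `travis_mvn_watchdog.sh` and that connector modules live under `flink-connectors/`):
    
        # fail the build if a connector module is missing from the static list
        for module_dir in flink-connectors/*/; do
            module="$(basename "${module_dir}")"
            if [[ " ${MODULES_CONNECTORS} " != *" ${module} "* ]]; then
                echo "ERROR: ${module} is not listed in MODULES_CONNECTORS"
                exit 1
            fi
        done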


---

[GitHub] flink pull request #6102: [FLINK-9091][build] Dependency convergence run aga...

Posted by twalthr <gi...@git.apache.org>.
Github user twalthr commented on a diff in the pull request:

    https://github.com/apache/flink/pull/6102#discussion_r199122374
  
    --- Diff: tools/travis_mvn_watchdog.sh ---
    @@ -533,6 +537,12 @@ esac
     # Run tests if compilation was successful
     if [ $EXIT_CODE == 0 ]; then
     
    +    # Start watching $MVN_OUT
    +    watchdog &
    --- End diff --
    
    Wrong indentation is used in this file.


---

[GitHub] flink pull request #6102: [FLINK-9091][build] Dependency convergence run aga...

Posted by twalthr <gi...@git.apache.org>.
Github user twalthr commented on a diff in the pull request:

    https://github.com/apache/flink/pull/6102#discussion_r199125903
  
    --- Diff: tools/check_dependency_convergence.sh ---
    @@ -0,0 +1,67 @@
    +#!/usr/bin/env bash
    +################################################################################
    +#  Licensed to the Apache Software Foundation (ASF) under one
    +#  or more contributor license agreements.  See the NOTICE file
    +#  distributed with this work for additional information
    +#  regarding copyright ownership.  The ASF licenses this file
    +#  to you under the Apache License, Version 2.0 (the
    +#  "License"); you may not use this file except in compliance
    +#  with the License.  You may obtain a copy of the License at
    +#
    +#      http://www.apache.org/licenses/LICENSE-2.0
    +#
    +#  Unless required by applicable law or agreed to in writing, software
    +#  distributed under the License is distributed on an "AS IS" BASIS,
    +#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +#  See the License for the specific language governing permissions and
    +# limitations under the License.
    +################################################################################
    +
    +HERE="`dirname \"$0\"`"				# relative
    +HERE="`( cd \"$HERE\" && pwd )`" 	# absolutized and normalized
    +if [ -z "$HERE" ] ; then
    +	# error; for some reason, the path is not accessible
    +	# to the script (e.g. permissions re-evaled after suid)
    +	exit 1  # fail
    +fi
    +
    +FLINK_DIR="${HERE}"
    +
    +if [[ $(basename ${HERE}) == "tools" ]] ; then
    +  FLINK_DIR="${HERE}/.."
    +fi
    +
    +FLINK_DIR="`( cd \"${FLINK_DIR}\" && pwd )`" 
    +
    +echo ${FLINK_DIR}
    +
    +# get list of all flink modules
    +# searches for directories containing a pom.xml file
    +# sorts the list alphabetically
    +# only accepts directories starting with "flink" to filter force-shading
    +modules=$(find -maxdepth 3 -name 'pom.xml' -printf '%h\n' | sort -u | grep "flink")
    --- End diff --
    
    Specify the search path explicitly, otherwise the command is not runnable in every environment.
    
    `-printf` is also not available on macOS:
    https://stackoverflow.com/questions/752818/find-lacks-the-option-printf-now-what
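    
    A portable alternative would avoid the GNU-only `-printf` and derive the directories with `sed` instead (a sketch; BSD find also needs the explicit `.` path):
    
        # list all module directories containing a pom.xml, portably
        modules=$(find . -maxdepth 3 -name 'pom.xml' | sed 's|/pom.xml$||' | sort -u | grep "flink")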


---