Posted to dev@flink.apache.org by Till Rohrmann <tr...@apache.org> on 2018/08/07 15:17:24 UTC

[VOTE] Release 1.6.0, release candidate #4

Hi everyone,
Please review and vote on the release candidate #4 for the version 1.6.0,
as follows:
[ ] +1, Approve the release
[ ] -1, Do not approve the release (please provide specific comments)


The complete staging area is available for your review, which includes:
* JIRA release notes [1],
* the official Apache source release and binary convenience releases to be
deployed to dist.apache.org [2], which are signed with the key with
fingerprint 1F302569A96CFFD5 [3],
* all artifacts to be deployed to the Maven Central Repository [4],
* source code tag "release-1.6.0-rc4" [5],
* website pull request listing the new release and adding announcement blog
post [6].

Please use this document for coordinating testing efforts: [7]

The vote will be shortened since we only added a minor fix on top of
RC3. It will close on Wednesday at 6:30pm CET. It is adopted by majority
approval, with at least 3 PMC affirmative votes.

Thanks,
Your friendly Release Manager

[1]
https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
[2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
[3] https://dist.apache.org/repos/dist/release/flink/KEYS
[4] https://repository.apache.org/content/repositories/orgapacheflink-1178
[5] https://github.com/apache/flink/tree/release-1.6.0-rc4
[6] https://github.com/apache/flink-web/pull/117
[7]
https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
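
For anyone verifying the checksums and signatures, the check is the usual
Apache release routine. Below is a sketch of the checksum half, demonstrated
on a locally created file so it is reproducible without the staged artifacts
(the artifact name is illustrative, not the exact staged file name):

```shell
# Illustrative artifact name; the real files live under [2].
printf 'staged release contents\n' > flink-1.6.0-src.tgz

# The release manager publishes a .sha512 next to each artifact;
# reviewers recompute it and compare:
sha512sum flink-1.6.0-src.tgz > flink-1.6.0-src.tgz.sha512
sha512sum -c flink-1.6.0-src.tgz.sha512

# For the real artifacts, additionally check the signature against the
# KEYS file [3] (fingerprint 1F302569A96CFFD5):
#   gpg --import KEYS
#   gpg --verify flink-1.6.0-src.tgz.asc flink-1.6.0-src.tgz
```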

Pro-tip: you can create a settings.xml file with these contents:

<settings>
  <activeProfiles>
    <activeProfile>flink-1.6.0</activeProfile>
  </activeProfiles>
  <profiles>
    <profile>
      <id>flink-1.6.0</id>
      <repositories>
        <repository>
          <id>flink-1.6.0</id>
          <url>https://repository.apache.org/content/repositories/orgapacheflink-1178/</url>
        </repository>
        <repository>
          <id>archetype</id>
          <url>https://repository.apache.org/content/repositories/orgapacheflink-1178/</url>
        </repository>
      </repositories>
    </profile>
  </profiles>
</settings>

And reference it in your Maven commands via --settings
path/to/settings.xml. This is useful for creating a quickstart based on the
staged release and for building against the staged jars.
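
For example (a sketch only; the archetype coordinates are assumptions,
substitute whatever you normally use, and the snippet prints the commands
rather than running them so it has no side effects):

```shell
# Hypothetical walkthrough: generate a quickstart against the staged
# repository, then build it against the staged jars.
# Printed via heredoc, not executed, so the sketch stays side-effect free.
SETTINGS=path/to/settings.xml
cat <<EOF
mvn archetype:generate --settings ${SETTINGS} -DarchetypeGroupId=org.apache.flink -DarchetypeArtifactId=flink-quickstart-java -DarchetypeVersion=1.6.0 -DinteractiveMode=false
mvn clean package --settings ${SETTINGS}
EOF
```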

Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Yaz Sh <ya...@gmail.com>.
Thanks for the fix!

/Yaz

On Thu, Aug 9, 2018 at 9:10 AM Till Rohrmann <tr...@apache.org> wrote:

> Thanks for reporting this problem Yaz. I just pushed a commit which should
> update the links accordingly once the Flink documentation gets rebuilt
> (overnight). Tomorrow it should be fixed.
>
> Cheers,
> Till
>
> On Thu, Aug 9, 2018 at 2:53 PM Yaz Sh <ya...@gmail.com> wrote:
>
> > Great Job on Release 1.6!
> >
> > I just checked it out and still I can see v.1.6-SNAPSHOT on the title of
> > https://ci.apache.org/projects/flink/flink-docs-release-1.6/
> >
> > and when I click on any options, it redirects me to master docs
> > 1.7-SNAPSHOT.
> >
> > I opened this ticket https://issues.apache.org/jira/browse/FLINK-10112
> >
> > Also I don’t see v1.6 on “Pick Docs Version" drop down
> >
> > Cheers,
> > Yaz
> >
> > > On Aug 8, 2018, at 3:24 PM, Timo Walther <tw...@apache.org> wrote:
> > >
> > > +1
> > >
> > > - successfully run `mvn clean verify` locally
> > > - successfully run end-to-end tests locally (except for SQL Client
> > end-to-end test)
> > >
> > > Found a bug in the class loading of SQL JAR files. This is not a
> blocker
> > but a bug that we should fix soon. As an easy workaround, users should not
> > use different Kafka versions as SQL Client dependencies.
> > >
> > > Regards,
> > > Timo
> > >
> > > Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
> > >> +1
> > >>
> > >> - verified compilation, tests
> > >> - verified checksum and gpg files
> > >> - verified sbt templates (g8, quickstart) - run assemblies on local
> > cluster
> > >>
> > >> - I could not execute the nightly-tests.sh though. The tests that were
> > >> failing most often are:
> > >>     - test_streaming_file_sink.sh
> > >>     - test_streaming_elasticsearch.sh
> > >>
> > >> Those are connectors though and it might be only tests flakiness so I
> > >> think it should not block the release.
> > >>
> > >> On 08/08/18 16:36, Chesnay Schepler wrote:
> > >>> I did not use the tools/list_deps.py script as I wasn't aware that it
> > >>> existed.
> > >>>
> > >>> Even if I were I wouldn't have used it and in fact would advocate for
> > >>> removing it.
> > >>> It manually parses and constructs dependency information which is
> > >>> utterly unnecessary as maven already provides this functionality,
> with
> > >>> the added bonus of also accounting for dependencyManagement and
> > >>> transitive dependencies which we obviously have to take into account.
> > >>>
> > >>> I used this one-liner instead:
> > >>> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" \
> > >>>   -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
> > >>>
> > >>> which I have documented here:
> > >>> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
> > >>>
> > >>> On 08.08.2018 15:06, Aljoscha Krettek wrote:
> > >>>> +1
> > >>>>
> > >>>> - verified checksum and gpg files
> > >>>> - verified LICENSE and NOTICE: NOTICE didn't change from 1.5,
> LICENSE
> > >>>> had one unnecessary part removed
> > >>>>
> > >>>> Side comment: I'm not sure whether the "Verify that the LICENSE and
> > >>>> NOTICE file is correct for the binary and source releases" part is
> > >>>> valid anymore because we only have one LICENSE and NOTICE file. also
> > >>>> "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
> > >>>> to the binary distribution and mention all of Flink's Maven
> > >>>> dependencies as well" can be dropped because we don't have them
> > anymore.
> > >>>>
> > >>>> I came to the same conclusion on dependencies. I used
> > >>>> tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
> > >>>> probably what Chesnay also did ... :-)
> > >>>>
> > >>>>> On 8. Aug 2018, at 14:43, Chesnay Schepler <ch...@apache.org>
> > wrote:
> > >>>>>
> > >>>>> +1
> > >>>>>
> > >>>>> - verified source release contains no binaries
> > >>>>> - verified correct versions in source release
> > >>>>> - verified compilation, tests and E2E-tests pass (on travis)
> > >>>>> - verified checksum and gpg files
> > >>>>>
> > >>>>> New dependencies (excluding dependencies where we simply depend on
> a
> > >>>>> different version now):
> > >>>>>     Apache licensed:
> > >>>>>         io.confluent:common-utils:jar:3.3.1
> > >>>>>         io.confluent:kafka-schema-registry-client:jar:3.3.1
> > >>>>>         io.prometheus:simpleclient_pushgateway:jar:0.3.0
> > >>>>>         various Apache Nifi dependencies
> > >>>>>         various Apache Parquet dependencies
> > >>>>>         various ElasticSearch dependencies
> > >>>>>     CDDL:
> > >>>>>         javax.ws.rs:javax.ws.rs-api:jar:2.1
> > >>>>>     Bouncycastle (MIT-like):
> > >>>>>         org.bouncycastle:bcpkix-jdk15on:jar:1.59
> > >>>>>         org.bouncycastle:bcprov-jdk15on:jar:1.59
> > >>>>>     MIT:
> > >>>>>         org.projectlombok:lombok:jar:1.16.20
> > >>>>>
> > >>>>> On 08.08.2018 13:28, Till Rohrmann wrote:
> > >>>>>> Thanks for reporting these problems Chesnay. The usage string in
> > >>>>>> `standalone-job.sh` is outdated and should be updated. The same
> > >>>>>> applies to
> > >>>>>> the typo.
> > >>>>>>
> > >>>>>> When calling `standalone-job.sh start --job-classname foobar.Job`
> > >>>>>> please
> > >>>>>> make sure that the user code jar is contained in the classpath
> (e.g.
> > >>>>>> putting the jar in the lib directory). Documenting this behaviour
> > >>>>>> is part
> > >>>>>> of the pending issue FLINK-10001.
> > >>>>>>
> > >>>>>> We should fix all of these issues. They are, however, no release
> > >>>>>> blockers.
> > >>>>>>
> > >>>>>> Cheers,
> > >>>>>> Till
> > >>>>>>
> > >>>>>> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler
> > >>>>>> <ch...@apache.org> wrote:
> > >>>>>>
> > >>>>>>> I found some issues with the standalone-job.sh script.
> > >>>>>>>
> > >>>>>>> I ran "./bin/standalone-job.sh start" as described by the usage
> > >>>>>>> string.
> > >>>>>>>
> > >>>>>>>      2018-08-08 09:22:34,385 ERROR
> > >>>>>>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint
> >  -
> > >>>>>>>      Could not parse command line arguments [--configDir,
> > >>>>>>>      /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
> > >>>>>>>      org.apache.flink.runtime.entrypoint.FlinkParseException:
> > >>>>>>> Failed to
> > >>>>>>>      parse the command line arguments.
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
> > >>>>>>>
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
> > >>>>>>>
> > >>>>>>>      Caused by: org.apache.commons.cli.MissingOptionException:
> > >>>>>>> Missing
> > >>>>>>>      required option: j
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
> > >>>>>>>
> > >>>>>>>               at
> > >>>>>>>
> >  org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
> > >>>>>>>               at
> > >>>>>>>
> >  org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
> > >>>>>>>
> > >>>>>>>              ... 1 more
> > >>>>>>>
> > >>>>>>> The script should fail earlier if no jar is provided, with a
> better
> > >>>>>>> error message.
> > >>>>>>> It is also undocumented, and the usage instructions don't appear
> > >>>>>>> correct.
> > >>>>>>>
> > >>>>>>> Passing a jar with the -j option leads to a
> ClassNotFoundException:
> > >>>>>>> "./bin/standalone-job.sh start -j
> examples/streaming/WordCount.jar"
> > >>>>>>>
> > >>>>>>>      2018-08-08 09:26:30,562 ERROR
> > >>>>>>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint
> >  -
> > >>>>>>>      Cluster initialization failed.
> > >>>>>>>      java.lang.reflect.UndeclaredThrowableException
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
> > >>>>>>>
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> > >>>>>>>
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
> > >>>>>>>
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
> > >>>>>>>
> > >>>>>>>      Caused by: org.apache.flink.util.FlinkException: Could not
> > >>>>>>> load the
> > >>>>>>>      provied entrypoint class.
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
> > >>>>>>>
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
> > >>>>>>>
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
> > >>>>>>>
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
> > >>>>>>>
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
> > >>>>>>>
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
> > >>>>>>>
> > >>>>>>>               at
> > >>>>>>> java.security.AccessController.doPrivileged(Native Method)
> > >>>>>>>               at
> javax.security.auth.Subject.doAs(Subject.java:422)
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
> > >>>>>>>
> > >>>>>>>               ... 3 more
> > >>>>>>>      Caused by: java.lang.ClassNotFoundException:
> > >>>>>>>      examples/streaming/WordCount.jar
> > >>>>>>>               at
> > >>>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> > >>>>>>>               at
> > >>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> > >>>>>>>               at
> > >>>>>>>
> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
> > >>>>>>>               at
> > >>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> > >>>>>>>               at
> > >>>>>>>
> > >>>>>>>
> >
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
> > >>>>>>>
> > >>>>>>>               ... 11 more
> > >>>>>>>
> > >>>>>>> So this seems to not work at all, but maybe I'm using it wrong?
> > >>>>>>>
> > >>>>>>> (There's also a typo in "Could not load the provied entrypoint
> > class")
> > >>>>>>>
> > >>>>>>> On 08.08.2018 10:33, Piotr Nowojski wrote:
> > >>>>>>>> +1 from my side
> > >>>>>>>>
> > >>>>>>>> I’ve spent some time playing around with various examples
> > (batching,
> > >>>>>>> streaming and SQL) on EMR 6 nodes cluster with yarn deployment,
> > with
> > >>>>>>> different configuration options (number of task
> > >>>>>>> managers/memory/Flip6/credit base flow control/metrics) and
> > >>>>>>> everything
> > >>>>>>> looks now fine (after fixing
> > >>>>>>> https://issues.apache.org/jira/browse/FLINK-9969).
> > >>>>>>>> Piotrek
> > >>>>>>>>
> > >>>>>>>>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org>
> > >>>>>>>>> wrote:
> > >>>>>>>>>
> > >>>>>>>>> Hi everyone,
> > >>>>>>>>> Please review and vote on the release candidate #4 for the
> > version
> > >>>>>>> 1.6.0,
> > >>>>>>>>> as follows:
> > >>>>>>>>> [ ] +1, Approve the release
> > >>>>>>>>> [ ] -1, Do not approve the release (please provide specific
> > >>>>>>>>> comments)
> > >>>>>>>>>
> > >>>>>>>>>
> > >>>>>>>>> The complete staging area is available for your review, which
> > >>>>>>>>> includes:
> > >>>>>>>>> * JIRA release notes [1],
> > >>>>>>>>> * the official Apache source release and binary convenience
> > >>>>>>>>> releases to
> > >>>>>>> be
> > >>>>>>>>> deployed to dist.apache.org [2], which are signed with the key
> > with
> > >>>>>>>>> fingerprint 1F302569A96CFFD5 [3],
> > >>>>>>>>> * all artifacts to be deployed to the Maven Central Repository
> > [4],
> > >>>>>>>>> * source code tag "release-1.6.0-rc4" [5],
> > >>>>>>>>> * website pull request listing the new release and adding
> > >>>>>>>>> announcement
> > >>>>>>> blog
> > >>>>>>>>> post [6].
> > >>>>>>>>>
> > >>>>>>>>> Please use this document for coordinating testing efforts: [7]
> > >>>>>>>>>
> > >>>>>>>>> The vote will be shortened since we only added a minor fix on
> top
> > >>>>>>>>> of the
> > >>>>>>> RC
> > >>>>>>>>> 3. It will close on Wednesday 6:30pm CET. It is adopted by
> > majority
> > >>>>>>>>> approval, with at least 3 PMC affirmative votes.
> > >>>>>>>>>
> > >>>>>>>>> Thanks,
> > >>>>>>>>> Your friendly Release Manager
> > >>>>>>>>>
> > >>>>>>>>> [1]
> > >>>>>>>>>
> > >>>>>>>
> >
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
> > >>>>>>>
> > >>>>>>>>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
> > >>>>>>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
> > >>>>>>>>> [4]
> > >>>>>>>
> > https://repository.apache.org/content/repositories/orgapacheflink-1178
> > >>>>>>>
> > >>>>>>>>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
> > >>>>>>>>> [6] https://github.com/apache/flink-web/pull/117
> > >>>>>>>>> [7]
> > >>>>>>>>>
> > >>>>>>>
> >
> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
> > >>>>>>>
> > >>>>>>>>> Pro-tip: you can create a settings.xml file with these
> contents:
> > >>>>>>>>>
> > >>>>>>>>> <settings>
> > >>>>>>>>> <activeProfiles>
> > >>>>>>>>>    <activeProfile>flink-1.6.0</activeProfile>
> > >>>>>>>>> </activeProfiles>
> > >>>>>>>>> <profiles>
> > >>>>>>>>>    <profile>
> > >>>>>>>>>      <id>flink-1.6.0</id>
> > >>>>>>>>>      <repositories>
> > >>>>>>>>>        <repository>
> > >>>>>>>>>          <id>flink-1.6.0</id>
> > >>>>>>>>>          <url>
> > >>>>>>>>>
> > >>>>>>>>>
> > https://repository.apache.org/content/repositories/orgapacheflink-1178/
> > >>>>>>>>>
> > >>>>>>>>>          </url>
> > >>>>>>>>>        </repository>
> > >>>>>>>>>        <repository>
> > >>>>>>>>>          <id>archetype</id>
> > >>>>>>>>>          <url>
> > >>>>>>>>>
> > >>>>>>>>>
> > https://repository.apache.org/content/repositories/orgapacheflink-1178/
> > >>>>>>>>>
> > >>>>>>>>>          </url>
> > >>>>>>>>>        </repository>
> > >>>>>>>>>      </repositories>
> > >>>>>>>>>    </profile>
> > >>>>>>>>> </profiles>
> > >>>>>>>>> </settings>
> > >>>>>>>>>
> > >>>>>>>>> And reference that in your Maven commands via --settings
> > >>>>>>>>> path/to/settings.xml. This is useful for creating a quickstart
> > >>>>>>>>> based on
> > >>>>>>> the
> > >>>>>>>>> staged release and for building against the staged jars.
> > >>>
> > >
> >
> >
>

Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Till Rohrmann <tr...@apache.org>.
Thanks for reporting this problem, Yaz. I just pushed a commit which should
update the links accordingly once the Flink documentation gets rebuilt
(overnight). Tomorrow it should be fixed.

Cheers,
Till

On Thu, Aug 9, 2018 at 2:53 PM Yaz Sh <ya...@gmail.com> wrote:

> Great Job on Release 1.6!
>
> I just checked it out and still I can see v.1.6-SNAPSHOT on the title of
> https://ci.apache.org/projects/flink/flink-docs-release-1.6/
>
> and when I click on any options, it redirects me to master docs
> 1.7-SNAPSHOT.
>
> I opened this ticket https://issues.apache.org/jira/browse/FLINK-10112
>
> Also I don’t see v1.6 on “Pick Docs Version" drop down
>
> Cheers,
> Yaz
>
> > On Aug 8, 2018, at 3:24 PM, Timo Walther <tw...@apache.org> wrote:
> >
> > +1
> >
> > - successfully run `mvn clean verify` locally
> > - successfully run end-to-end tests locally (except for SQL Client
> end-to-end test)
> >
> > Found a bug in the class loading of SQL JAR files. This is not a blocker
> but a bug that we should fix soon. As an easy workaround, users should not
> use different Kafka versions as SQL Client dependencies.
> >
> > Regards,
> > Timo
> >
> > Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
> >> +1
> >>
> >> - verified compilation, tests
> >> - verified checksum and gpg files
> >> - verified sbt templates (g8, quickstart) - run assemblies on local
> cluster
> >>
> >> - I could not execute the nightly-tests.sh though. The tests that were
> >> failing most often are:
> >>     - test_streaming_file_sink.sh
> >>     - test_streaming_elasticsearch.sh
> >>
> >> Those are connectors though and it might be only tests flakiness so I
> >> think it should not block the release.
> >>
> >> On 08/08/18 16:36, Chesnay Schepler wrote:
> >>> I did not use the tools/list_deps.py script as I wasn't aware that it
> >>> existed.
> >>>
> >>> Even if I were I wouldn't have used it and in fact would advocate for
> >>> removing it.
> >>> It manually parses and constructs dependency information which is
> >>> utterly unnecessary as maven already provides this functionality, with
> >>> the added bonus of also accounting for dependencyManagement and
> >>> transitive dependencies which we obviously have to take into account.
> >>>
> >>> I used this one-liner instead:
> >>> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" \
> >>>   -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
> >>>
> >>> which I have documented here:
> >>> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
> >>>
> >>> On 08.08.2018 15:06, Aljoscha Krettek wrote:
> >>>> +1
> >>>>
> >>>> - verified checksum and gpg files
> >>>> - verified LICENSE and NOTICE: NOTICE didn't change from 1.5, LICENSE
> >>>> had one unnecessary part removed
> >>>>
> >>>> Side comment: I'm not sure whether the "Verify that the LICENSE and
> >>>> NOTICE file is correct for the binary and source releases" part is
> >>>> valid anymore because we only have one LICENSE and NOTICE file. also
> >>>> "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
> >>>> to the binary distribution and mention all of Flink's Maven
> >>>> dependencies as well" can be dropped because we don't have them
> anymore.
> >>>>
> >>>> I came to the same conclusion on dependencies. I used
> >>>> tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
> >>>> probably what Chesnay also did ... :-)
> >>>>
> >>>>> On 8. Aug 2018, at 14:43, Chesnay Schepler <ch...@apache.org>
> wrote:
> >>>>>
> >>>>> +1
> >>>>>
> >>>>> - verified source release contains no binaries
> >>>>> - verified correct versions in source release
> >>>>> - verified compilation, tests and E2E-tests pass (on travis)
> >>>>> - verified checksum and gpg files
> >>>>>
> >>>>> New dependencies (excluding dependencies where we simply depend on a
> >>>>> different version now):
> >>>>>     Apache licensed:
> >>>>>         io.confluent:common-utils:jar:3.3.1
> >>>>>         io.confluent:kafka-schema-registry-client:jar:3.3.1
> >>>>>         io.prometheus:simpleclient_pushgateway:jar:0.3.0
> >>>>>         various Apache Nifi dependencies
> >>>>>         various Apache Parquet dependencies
> >>>>>         various ElasticSearch dependencies
> >>>>>     CDDL:
> >>>>>         javax.ws.rs:javax.ws.rs-api:jar:2.1
> >>>>>     Bouncycastle (MIT-like):
> >>>>>         org.bouncycastle:bcpkix-jdk15on:jar:1.59
> >>>>>         org.bouncycastle:bcprov-jdk15on:jar:1.59
> >>>>>     MIT:
> >>>>>         org.projectlombok:lombok:jar:1.16.20
> >>>>>
> >>>>> On 08.08.2018 13:28, Till Rohrmann wrote:
> >>>>>> Thanks for reporting these problems Chesnay. The usage string in
> >>>>>> `standalone-job.sh` is outdated and should be updated. The same
> >>>>>> applies to
> >>>>>> the typo.
> >>>>>>
> >>>>>> When calling `standalone-job.sh start --job-classname foobar.Job`
> >>>>>> please
> >>>>>> make sure that the user code jar is contained in the classpath (e.g.
> >>>>>> putting the jar in the lib directory). Documenting this behaviour
> >>>>>> is part
> >>>>>> of the pending issue FLINK-10001.
> >>>>>>
> >>>>>> We should fix all of these issues. They are, however, no release
> >>>>>> blockers.
> >>>>>>
> >>>>>> Cheers,
> >>>>>> Till
> >>>>>>
> >>>>>> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler
> >>>>>> <ch...@apache.org> wrote:
> >>>>>>
> >>>>>>> I found some issues with the standalone-job.sh script.
> >>>>>>>
> >>>>>>> I ran "./bin/standalone-job.sh start" as described by the usage
> >>>>>>> string.
> >>>>>>>
> >>>>>>>      2018-08-08 09:22:34,385 ERROR
> >>>>>>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint
>  -
> >>>>>>>      Could not parse command line arguments [--configDir,
> >>>>>>>      /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
> >>>>>>>      org.apache.flink.runtime.entrypoint.FlinkParseException:
> >>>>>>> Failed to
> >>>>>>>      parse the command line arguments.
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
> >>>>>>>
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
> >>>>>>>
> >>>>>>>      Caused by: org.apache.commons.cli.MissingOptionException:
> >>>>>>> Missing
> >>>>>>>      required option: j
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
> >>>>>>>
> >>>>>>>               at
> >>>>>>>
>  org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
> >>>>>>>               at
> >>>>>>>
>  org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
> >>>>>>>
> >>>>>>>              ... 1 more
> >>>>>>>
> >>>>>>> The script should fail earlier if no jar is provided, with a better
> >>>>>>> error message.
> >>>>>>> It is also undocumented, and the usage instructions don't appear
> >>>>>>> correct.
> >>>>>>>
> >>>>>>> Passing a jar with the -j option leads to a ClassNotFoundException:
> >>>>>>> "./bin/standalone-job.sh start -j examples/streaming/WordCount.jar"
> >>>>>>>
> >>>>>>>      2018-08-08 09:26:30,562 ERROR
> >>>>>>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint
>  -
> >>>>>>>      Cluster initialization failed.
> >>>>>>>      java.lang.reflect.UndeclaredThrowableException
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
> >>>>>>>
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> >>>>>>>
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
> >>>>>>>
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
> >>>>>>>
> >>>>>>>      Caused by: org.apache.flink.util.FlinkException: Could not
> >>>>>>> load the
> >>>>>>>      provied entrypoint class.
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
> >>>>>>>
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
> >>>>>>>
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
> >>>>>>>
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
> >>>>>>>
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
> >>>>>>>
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
> >>>>>>>
> >>>>>>>               at
> >>>>>>> java.security.AccessController.doPrivileged(Native Method)
> >>>>>>>               at javax.security.auth.Subject.doAs(Subject.java:422)
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
> >>>>>>>
> >>>>>>>               ... 3 more
> >>>>>>>      Caused by: java.lang.ClassNotFoundException:
> >>>>>>>      examples/streaming/WordCount.jar
> >>>>>>>               at
> >>>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> >>>>>>>               at
> >>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> >>>>>>>               at
> >>>>>>>      sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
> >>>>>>>               at
> >>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> >>>>>>>               at
> >>>>>>>
> >>>>>>>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
> >>>>>>>
> >>>>>>>               ... 11 more
> >>>>>>>
> >>>>>>> So this seems to not work at all, but maybe I'm using it wrong?
> >>>>>>>
> >>>>>>> (There's also a typo in "Could not load the provied entrypoint
> class")
> >>>>>>>
> >>>>>>> On 08.08.2018 10:33, Piotr Nowojski wrote:
> >>>>>>>> +1 from my side
> >>>>>>>>
> >>>>>>>> I’ve spent some time playing around with various examples
> (batching,
> >>>>>>> streaming and SQL) on EMR 6 nodes cluster with yarn deployment,
> with
> >>>>>>> different configuration options (number of task
> >>>>>>> managers/memory/Flip6/credit base flow control/metrics) and
> >>>>>>> everything
> >>>>>>> looks now fine (after fixing
> >>>>>>> https://issues.apache.org/jira/browse/FLINK-9969).
> >>>>>>>> Piotrek
> >>>>>>>>
> >>>>>>>>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org>
> >>>>>>>>> wrote:
> >>>>>>>>>
> >>>>>>>>> Hi everyone,
> >>>>>>>>> Please review and vote on the release candidate #4 for the
> version
> >>>>>>> 1.6.0,
> >>>>>>>>> as follows:
> >>>>>>>>> [ ] +1, Approve the release
> >>>>>>>>> [ ] -1, Do not approve the release (please provide specific
> >>>>>>>>> comments)
> >>>>>>>>>
> >>>>>>>>>
> >>>>>>>>> The complete staging area is available for your review, which
> >>>>>>>>> includes:
> >>>>>>>>> * JIRA release notes [1],
> >>>>>>>>> * the official Apache source release and binary convenience
> >>>>>>>>> releases to
> >>>>>>> be
> >>>>>>>>> deployed to dist.apache.org [2], which are signed with the key
> with
> >>>>>>>>> fingerprint 1F302569A96CFFD5 [3],
> >>>>>>>>> * all artifacts to be deployed to the Maven Central Repository
> [4],
> >>>>>>>>> * source code tag "release-1.6.0-rc4" [5],
> >>>>>>>>> * website pull request listing the new release and adding
> >>>>>>>>> announcement
> >>>>>>> blog
> >>>>>>>>> post [6].
> >>>>>>>>>
> >>>>>>>>> Please use this document for coordinating testing efforts: [7]
> >>>>>>>>>
> >>>>>>>>> The vote will be shortened since we only added a minor fix on top
> >>>>>>>>> of the
> >>>>>>> RC
> >>>>>>>>> 3. It will close on Wednesday 6:30pm CET. It is adopted by
> majority
> >>>>>>>>> approval, with at least 3 PMC affirmative votes.
> >>>>>>>>>
> >>>>>>>>> Thanks,
> >>>>>>>>> Your friendly Release Manager
> >>>>>>>>>
> >>>>>>>>> [1]
> >>>>>>>>>
> >>>>>>>
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
> >>>>>>>
> >>>>>>>>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
> >>>>>>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
> >>>>>>>>> [4]
> >>>>>>>
> https://repository.apache.org/content/repositories/orgapacheflink-1178
> >>>>>>>
> >>>>>>>>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
> >>>>>>>>> [6] https://github.com/apache/flink-web/pull/117
> >>>>>>>>> [7]
> >>>>>>>>>
> >>>>>>>
> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
> >>>>>>>
> >>>>>>>>> Pro-tip: you can create a settings.xml file with these contents:
> >>>>>>>>>
> >>>>>>>>> <settings>
> >>>>>>>>> <activeProfiles>
> >>>>>>>>>    <activeProfile>flink-1.6.0</activeProfile>
> >>>>>>>>> </activeProfiles>
> >>>>>>>>> <profiles>
> >>>>>>>>>    <profile>
> >>>>>>>>>      <id>flink-1.6.0</id>
> >>>>>>>>>      <repositories>
> >>>>>>>>>        <repository>
> >>>>>>>>>          <id>flink-1.6.0</id>
> >>>>>>>>>          <url>
> >>>>>>>>>
> >>>>>>>>>
> https://repository.apache.org/content/repositories/orgapacheflink-1178/
> >>>>>>>>>
> >>>>>>>>>          </url>
> >>>>>>>>>        </repository>
> >>>>>>>>>        <repository>
> >>>>>>>>>          <id>archetype</id>
> >>>>>>>>>          <url>
> >>>>>>>>>
> >>>>>>>>>
> https://repository.apache.org/content/repositories/orgapacheflink-1178/
> >>>>>>>>>
> >>>>>>>>>          </url>
> >>>>>>>>>        </repository>
> >>>>>>>>>      </repositories>
> >>>>>>>>>    </profile>
> >>>>>>>>> </profiles>
> >>>>>>>>> </settings>
> >>>>>>>>>
> >>>>>>>>> And reference that in your maven commands via --settings
> >>>>>>>>> path/to/settings.xml. This is useful for creating a quickstart
> >>>>>>>>> based on
> >>>>>>> the
> >>>>>>>>> staged release and for building against the staged jars.
> >>>
> >
>
>
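The settings.xml pro-tip above activates a profile that points Maven at the staged repository. As a quick sanity check before running any builds against it, the profile/activation wiring can be verified with Python's standard library. This is a sketch independent of Maven itself; the embedded XML mirrors the snippet from the email, with only one of the two repositories shown for brevity:

```python
import xml.etree.ElementTree as ET

# Abbreviated copy of the settings.xml from the email above.
SETTINGS = """\
<settings>
  <activeProfiles>
    <activeProfile>flink-1.6.0</activeProfile>
  </activeProfiles>
  <profiles>
    <profile>
      <id>flink-1.6.0</id>
      <repositories>
        <repository>
          <id>flink-1.6.0</id>
          <url>https://repository.apache.org/content/repositories/orgapacheflink-1178/</url>
        </repository>
      </repositories>
    </profile>
  </profiles>
</settings>
"""

def active_repo_urls(settings_xml: str) -> list[str]:
    """Return the repository URLs of every profile that is actually
    referenced by an <activeProfile> entry."""
    root = ET.fromstring(settings_xml)
    active = {e.text for e in root.iter("activeProfile")}
    urls = []
    for profile in root.iter("profile"):
        if profile.findtext("id") in active:
            urls += [u.text.strip() for u in profile.iter("url")]
    return urls

print(active_repo_urls(SETTINGS))
```

If the list comes back empty, the `<activeProfile>` value does not match any `<profile><id>`, which is an easy mistake to make when copying the snippet.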

Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Yaz Sh <ya...@gmail.com>.
Great Job on Release 1.6!

I just checked it out and I can still see v.1.6-SNAPSHOT in the title of https://ci.apache.org/projects/flink/flink-docs-release-1.6/

and when I click on any option, it redirects me to the master docs (1.7-SNAPSHOT).

I opened this ticket: https://issues.apache.org/jira/browse/FLINK-10112

Also, I don't see v1.6 in the "Pick Docs Version" drop-down.

Cheers,
Yaz

> On Aug 8, 2018, at 3:24 PM, Timo Walther <tw...@apache.org> wrote:
> 
> +1
> 
> - successfully run `mvn clean verify` locally
> - successfully run end-to-end tests locally (except for SQL Client end-to-end test)
> 
> Found a bug in the class loading of SQL JAR files. This is not a blocker but a bug that we should fix soon. As an easy workaround, users should not use different Kafka versions as SQL Client dependencies.
> 
> Regards,
> Timo
> 
> Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
>> +1
>> 
>> - verified compilation, tests
>> - verified checksum and gpg files
>> - verified sbt templates (g8, quickstart) - run assemblies on local cluster
>> 
>> - I could not execute the nightly-tests.sh though. The tests that were
>> failing most often are:
>>     - test_streaming_file_sink.sh
>>     - test_streaming_elasticsearch.sh
>> 
>> Those are connectors though and it might only be test flakiness so I
>> think it should not block the release.
>> 
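Several of the votes in this thread include "verified checksum and gpg files". For reference, the commands involved look roughly like the following. This is a sketch: the file name follows the dist.apache.org layout linked as [2], and a stand-in file is created locally so the sha512 round-trip is concrete without downloading the artifact.

```shell
# Stand-in artifact so the checksum check below actually runs; for a real RC
# you would download flink-1.6.0-src.tgz and its .sha512 from [2] instead.
printf 'demo' > flink-1.6.0-src.tgz
sha512sum flink-1.6.0-src.tgz > flink-1.6.0-src.tgz.sha512

# Verify the checksum; prints "flink-1.6.0-src.tgz: OK" on success.
sha512sum -c flink-1.6.0-src.tgz.sha512

# Signature verification against the released KEYS file [3] would then be:
#   gpg --import KEYS
#   gpg --verify flink-1.6.0-src.tgz.asc flink-1.6.0-src.tgz
```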
>> On 08/08/18 16:36, Chesnay Schepler wrote:
>>> I did not use the tools/list_deps.py script as I wasn't aware that it
>>> existed.
>>> 
>>> Even if I were I wouldn't have used it and in fact would advocate for
>>> removing it.
>>> It manually parses and constructs dependency information which is
>>> utterly unnecessary as maven already provides this functionality, with
>>> the added bonus of also accounting for dependencyManagement and
>>> transitive dependencies which we obviously have to take into account.
>>> 
>>> I used this one-liner instead:
>>> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
>>> 
>>> which I have documented here:
>>> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
>>> 
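The one-liner above post-processes `mvn dependency:list` output. For anyone who prefers doing that filtering in a script, a rough Python equivalent could look like this. It is a sketch operating on captured output, and it assumes Maven's usual line format of `[INFO]    group:artifact:type:version:scope`:

```python
import re

def extract_coordinates(dependency_list_output: str) -> list[str]:
    """Collect unique group:artifact:type:version coordinates from
    `mvn dependency:list` output, dropping the trailing :scope suffix --
    the same effect as the grep/cut/sed/sort pipeline above."""
    coords = set()
    for line in dependency_list_output.splitlines():
        # Dependency lines look like:
        #   "[INFO]    org.slf4j:slf4j-api:jar:1.7.7:compile"
        match = re.search(r"\]\s+(\S+:\S+:\S+:\S+):[a-z]+\s*$", line)
        if match:
            coords.add(match.group(1))
    return sorted(coords)
```

Diffing the sorted output for two release candidates (with plain `diff` or `difflib`) then yields the dependency delta discussed in this thread.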
>>> On 08.08.2018 15:06, Aljoscha Krettek wrote:
>>>> +1
>>>> 
>>>> - verified checksum and gpg files
>>>> - verified LICENSE and NOTICE: NOTICE didn't change from 1.5, LICENSE
>>>> had one unnecessary part removed
>>>> 
>>>> Side comment: I'm not sure whether the "Verify that the LICENSE and
>>>> NOTICE file is correct for the binary and source releases" part is
>>>> valid anymore because we only have one LICENSE and NOTICE file. also
>>>> "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
>>>> to the binary distribution and mention all of Flink's Maven
>>>> dependencies as well" can be dropped because we don't have them anymore.
>>>> 
>>>> I came to the same conclusion on dependencies. I used
>>>> tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
>>>> probably what Chesnay also did ... :-)
>>>> 
>>>>> On 8. Aug 2018, at 14:43, Chesnay Schepler <ch...@apache.org> wrote:
>>>>> 
>>>>> +1
>>>>> 
>>>>> - verified source release contains no binaries
>>>>> - verified correct versions in source release
>>>>> - verified compilation, tests and E2E-tests pass (on travis)
>>>>> - verified checksum and gpg files
>>>>> 
>>>>> New dependencies (excluding dependencies where we simply depend on a
>>>>> different version now):
>>>>>     Apache licensed:
>>>>>         io.confluent:common-utils:jar:3.3.1
>>>>>         io.confluent:kafka-schema-registry-client:jar:3.3.1
>>>>>         io.prometheus:simpleclient_pushgateway:jar:0.3.0
>>>>>         various Apache Nifi dependencies
>>>>>         various Apache Parquet dependencies
>>>>>         various ElasticSearch dependencies
>>>>>     CDDL:
>>>>>         javax.ws.rs:javax.ws.rs-api:jar:2.1
>>>>>     Bouncycastle (MIT-like):
>>>>>         org.bouncycastle:bcpkix-jdk15on:jar:1.59
>>>>>         org.bouncycastle:bcprov-jdk15on:jar:1.59
>>>>>     MIT:
>>>>>         org.projectlombok:lombok:jar:1.16.20
>>>>> 
>>>>> On 08.08.2018 13:28, Till Rohrmann wrote:
>>>>>> Thanks for reporting these problems Chesnay. The usage string in
>>>>>> `standalone-job.sh` is outdated and should be updated. The same
>>>>>> applies to
>>>>>> the typo.
>>>>>> 
>>>>>> When calling `standalone-job.sh start --job-classname foobar.Job`
>>>>>> please
>>>>>> make sure that the user code jar is contained in the classpath (e.g.
>>>>>> putting the jar in the lib directory). Documenting this behaviour
>>>>>> is part
>>>>>> of the pending issue FLINK-10001.
>>>>>> 
>>>>>> We should fix all of these issues. They are, however, no release
>>>>>> blockers.
>>>>>> 
>>>>>> Cheers,
>>>>>> Till
>>>>>> 
>>>>>> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler
>>>>>> <ch...@apache.org> wrote:
>>>>>> 
>>>>>>> I found some issues with the standalone-job.sh script.
>>>>>>> 
>>>>>>> I ran "./bin/standalone-job.sh start" as described by the usage
>>>>>>> string.
>>>>>>> 
>>>>>>>      2018-08-08 09:22:34,385 ERROR
>>>>>>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>>>>>>      Could not parse command line arguments [--configDir,
>>>>>>>      /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
>>>>>>>      org.apache.flink.runtime.entrypoint.FlinkParseException:
>>>>>>> Failed to
>>>>>>>      parse the command line arguments.
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
>>>>>>> 
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
>>>>>>> 
>>>>>>>      Caused by: org.apache.commons.cli.MissingOptionException:
>>>>>>> Missing
>>>>>>>      required option: j
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
>>>>>>> 
>>>>>>>               at
>>>>>>>     org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
>>>>>>>               at
>>>>>>>     org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
>>>>>>> 
>>>>>>>              ... 1 more
>>>>>>> 
>>>>>>> The script should fail earlier if no jar is provided, with a better
>>>>>>> error message.
>>>>>>> It is also undocumented, and the usage instructions don't appear
>>>>>>> correct.
>>>>>>> 
>>>>>>> Passing a jar with the -j option leads to a ClassNotFoundException:
>>>>>>> "./bin/standalone-job.sh start -j examples/streaming/WordCount.jar"
>>>>>>> 
>>>>>>>      2018-08-08 09:26:30,562 ERROR
>>>>>>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>>>>>>      Cluster initialization failed.
>>>>>>>      java.lang.reflect.UndeclaredThrowableException
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
>>>>>>> 
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>>>>> 
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
>>>>>>> 
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
>>>>>>> 
>>>>>>>      Caused by: org.apache.flink.util.FlinkException: Could not
>>>>>>> load the
>>>>>>>      provied entrypoint class.
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
>>>>>>> 
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
>>>>>>> 
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
>>>>>>> 
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
>>>>>>> 
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
>>>>>>> 
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
>>>>>>> 
>>>>>>>               at
>>>>>>> java.security.AccessController.doPrivileged(Native Method)
>>>>>>>               at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>>>>>> 
>>>>>>>               ... 3 more
>>>>>>>      Caused by: java.lang.ClassNotFoundException:
>>>>>>>      examples/streaming/WordCount.jar
>>>>>>>               at
>>>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>>>>>               at
>>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>>>>               at
>>>>>>>      sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
>>>>>>>               at
>>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>>>>               at
>>>>>>> 
>>>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
>>>>>>> 
>>>>>>>               ... 11 more
>>>>>>> 
>>>>>>> So this seems to not work at all, but maybe I'm using it wrong?
>>>>>>> 
>>>>>>> (There's also a typo in "Could not load the provied entrypoint class")
>>>>>>> 
>>>>>>> On 08.08.2018 10:33, Piotr Nowojski wrote:
>>>>>>>> +1 from my side
>>>>>>>> 
>>>>>>>> I’ve spent some time playing around with various examples (batching,
>>>>>>> streaming and SQL) on a 6-node EMR cluster with YARN deployment, with
>>>>>>> different configuration options (number of task
>>>>>>> managers/memory/Flip6/credit-based flow control/metrics) and
>>>>>>> everything
>>>>>>> looks now fine (after fixing
>>>>>>> https://issues.apache.org/jira/browse/FLINK-9969 <
>>>>>>> https://issues.apache.org/jira/browse/FLINK-9969> ).
>>>>>>>> Piotrek
>>>>>>>> 
>>>>>>>>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org>
>>>>>>>>> wrote:
>>>>>>>>> 
>>>>>>>>> Hi everyone,
>>>>>>>>> Please review and vote on the release candidate #4 for the version
>>>>>>> 1.6.0,
>>>>>>>>> as follows:
>>>>>>>>> [ ] +1, Approve the release
>>>>>>>>> [ ] -1, Do not approve the release (please provide specific
>>>>>>>>> comments)
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> The complete staging area is available for your review, which
>>>>>>>>> includes:
>>>>>>>>> * JIRA release notes [1],
>>>>>>>>> * the official Apache source release and binary convenience
>>>>>>>>> releases to
>>>>>>> be
>>>>>>>>> deployed to dist.apache.org [2], which are signed with the key with
>>>>>>>>> fingerprint 1F302569A96CFFD5 [3],
>>>>>>>>> * all artifacts to be deployed to the Maven Central Repository [4],
>>>>>>>>> * source code tag "release-1.6.0-rc4" [5],
>>>>>>>>> * website pull request listing the new release and adding
>>>>>>>>> announcement
>>>>>>> blog
>>>>>>>>> post [6].
>>>>>>>>> 
>>>>>>>>> Please use this document for coordinating testing efforts: [7]
>>>>>>>>> 
>>>>>>>>> The vote will be shortened since we only added a minor fix on top
>>>>>>>>> of the
>>>>>>> RC
>>>>>>>>> 3. It will close on Wednesday 6:30pm CET. It is adopted by majority
>>>>>>>>> approval, with at least 3 PMC affirmative votes.
>>>>>>>>> 
>>>>>>>>> Thanks,
>>>>>>>>> Your friendly Release Manager
>>>>>>>>> 
>>>>>>>>> [1]
>>>>>>>>> 
>>>>>>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
>>>>>>> 
>>>>>>>>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
>>>>>>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
>>>>>>>>> [4]
>>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178
>>>>>>> 
>>>>>>>>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
>>>>>>>>> [6] https://github.com/apache/flink-web/pull/117
>>>>>>>>> [7]
>>>>>>>>> 
>>>>>>> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
>>>>>>> 
>>>>>>>>> Pro-tip: you can create a settings.xml file with these contents:
>>>>>>>>> 
>>>>>>>>> <settings>
>>>>>>>>> <activeProfiles>
>>>>>>>>>    <activeProfile>flink-1.6.0</activeProfile>
>>>>>>>>> </activeProfiles>
>>>>>>>>> <profiles>
>>>>>>>>>    <profile>
>>>>>>>>>      <id>flink-1.6.0</id>
>>>>>>>>>      <repositories>
>>>>>>>>>        <repository>
>>>>>>>>>          <id>flink-1.6.0</id>
>>>>>>>>>          <url>
>>>>>>>>> 
>>>>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>>>>>> 
>>>>>>>>>          </url>
>>>>>>>>>        </repository>
>>>>>>>>>        <repository>
>>>>>>>>>          <id>archetype</id>
>>>>>>>>>          <url>
>>>>>>>>> 
>>>>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>>>>>> 
>>>>>>>>>          </url>
>>>>>>>>>        </repository>
>>>>>>>>>      </repositories>
>>>>>>>>>    </profile>
>>>>>>>>> </profiles>
>>>>>>>>> </settings>
>>>>>>>>> 
>>>>>>>>> And reference that in your maven commands via --settings
>>>>>>>>> path/to/settings.xml. This is useful for creating a quickstart
>>>>>>>>> based on
>>>>>>> the
>>>>>>>>> staged release and for building against the staged jars.
>>> 
> 


Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Till Rohrmann <tr...@apache.org>.
I hereby close the vote. The result will be announced in a separate thread.

On Thu, Aug 9, 2018 at 11:47 AM Till Rohrmann <tr...@apache.org> wrote:

> +1
>
> - Checked checksums and signatures
> - Verified that no unwanted binaries are contained in source release
> - Checked LICENSE and NOTICE file
> - Checked that all newly added dependencies have a compatible license
> - Checked that a local cluster can be started and stopped without
> exceptions in the log
> - Verified that SBT quickstarts are up to date
> - Verified that Java quickstarts work with IntelliJ
> - Verified that all Jepsen tests pass
> - Verified that e2e tests modulo test_sql_client.sh (see
> https://issues.apache.org/jira/browse/FLINK-10107) pass
>
> Cheers,
> Till
>
> On Thu, Aug 9, 2018 at 8:18 AM vino yang <ya...@gmail.com> wrote:
>
>> +1
>>
>> - checkout 1.6 source code and successfully run `mvn clean package
>> -DskipTests`
>> - searched '1.5' and '1.5.2' in all modules pom file and successfully
>> verified flink version was changed
>> - successfully run table and sql test locally
>>
>> Thanks, vino.
>>
>>
>> Timo Walther <tw...@apache.org> 于2018年8月9日周四 上午3:24写道:
>>
>> > +1
>> >
>> > - successfully run `mvn clean verify` locally
>> > - successfully run end-to-end tests locally (except for SQL Client
>> > end-to-end test)
>> >
>> > Found a bug in the class loading of SQL JAR files. This is not a blocker
>> > but a bug that we should fix soon. As an easy workaround, users should not
>> > use different Kafka versions as SQL Client dependencies.
>> >
>> > Regards,
>> > Timo
>> >
>> > Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
>> > > +1
>> > >
>> > > - verified compilation, tests
>> > > - verified checksum and gpg files
>> > > - verified sbt templates (g8, quickstart) - run assemblies on local
>> > cluster
>> > >
>> > > - I could not execute the nightly-tests.sh though. The tests that were
>> > > failing most often are:
>> > >      - test_streaming_file_sink.sh
>> > >      - test_streaming_elasticsearch.sh
>> > >
>> > > Those are connectors though and it might only be test flakiness so I
>> > > think it should not block the release.
>> > >
>> > > On 08/08/18 16:36, Chesnay Schepler wrote:
>> > >> I did not use the tools/list_deps.py script as I wasn't aware that it
>> > >> existed.
>> > >>
>> > >> Even if I were I wouldn't have used it and in fact would advocate for
>> > >> removing it.
>> > >> It manually parses and constructs dependency information which is
>> > >> utterly unnecessary as maven already provides this functionality,
>> with
>> > >> the added bonus of also accounting for dependencyManagement and
>> > >> transitive dependencies which we obviously have to take into account.
>> > >>
>> > >> I used this one-liner instead:
>> > >> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
>> > >>
>> > >> which I have documented here:
>> > >> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
>> > >>
>> > >> On 08.08.2018 15:06, Aljoscha Krettek wrote:
>> > >>> +1
>> > >>>
>> > >>> - verified checksum and gpg files
>> > >>> - verified LICENSE and NOTICE: NOTICE didn't change from 1.5,
>> LICENSE
>> > >>> had one unnecessary part removed
>> > >>>
>> > >>> Side comment: I'm not sure whether the "Verify that the LICENSE and
>> > >>> NOTICE file is correct for the binary and source releases" part is
>> > >>> valid anymore because we only have one LICENSE and NOTICE file. also
>> > >>> "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
>> > >>> to the binary distribution and mention all of Flink's Maven
>> > >>> dependencies as well" can be dropped because we don't have them
>> > anymore.
>> > >>>
>> > >>> I came to the same conclusion on dependencies. I used
>> > >>> tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
>> > >>> probably what Chesnay also did ... :-)
>> > >>>
>> > >>>> On 8. Aug 2018, at 14:43, Chesnay Schepler <ch...@apache.org>
>> > wrote:
>> > >>>>
>> > >>>> +1
>> > >>>>
>> > >>>> - verified source release contains no binaries
>> > >>>> - verified correct versions in source release
>> > >>>> - verified compilation, tests and E2E-tests pass (on travis)
>> > >>>> - verified checksum and gpg files
>> > >>>>
>> > >>>> New dependencies (excluding dependencies where we simply depend on
>> a
>> > >>>> different version now):
>> > >>>>      Apache licensed:
>> > >>>>          io.confluent:common-utils:jar:3.3.1
>> > >>>>          io.confluent:kafka-schema-registry-client:jar:3.3.1
>> > >>>>          io.prometheus:simpleclient_pushgateway:jar:0.3.0
>> > >>>>          various Apache Nifi dependencies
>> > >>>>          various Apache Parquet dependencies
>> > >>>>          various ElasticSearch dependencies
>> > >>>>      CDDL:
>> > >>>>          javax.ws.rs:javax.ws.rs-api:jar:2.1
>> > >>>>      Bouncycastle (MIT-like):
>> > >>>>          org.bouncycastle:bcpkix-jdk15on:jar:1.59
>> > >>>>          org.bouncycastle:bcprov-jdk15on:jar:1.59
>> > >>>>      MIT:
>> > >>>>          org.projectlombok:lombok:jar:1.16.20
>> > >>>>
>> > >>>> On 08.08.2018 13:28, Till Rohrmann wrote:
>> > >>>>> Thanks for reporting these problems Chesnay. The usage string in
>> > >>>>> `standalone-job.sh` is outdated and should be updated. The same
>> > >>>>> applies to
>> > >>>>> the typo.
>> > >>>>>
>> > >>>>> When calling `standalone-job.sh start --job-classname foobar.Job`
>> > >>>>> please
>> > >>>>> make sure that the user code jar is contained in the classpath
>> (e.g.
>> > >>>>> putting the jar in the lib directory). Documenting this behaviour
>> > >>>>> is part
>> > >>>>> of the pending issue FLINK-10001.
>> > >>>>>
>> > >>>>> We should fix all of these issues. They are, however, no release
>> > >>>>> blockers.
>> > >>>>>
>> > >>>>> Cheers,
>> > >>>>> Till
>> > >>>>>
>> > >>>>> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler
>> > >>>>> <ch...@apache.org> wrote:
>> > >>>>>
>> > >>>>>> I found some issues with the standalone-job.sh script.
>> > >>>>>>
>> > >>>>>> I ran "./bin/standalone-job.sh start" as described by the usage
>> > >>>>>> string.
>> > >>>>>>
>> > >>>>>>       2018-08-08 09:22:34,385 ERROR
>> > >>>>>>       org.apache.flink.runtime.entrypoint.ClusterEntrypoint
>> > -
>> > >>>>>>       Could not parse command line arguments [--configDir,
>> > >>>>>>       /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
>> > >>>>>>       org.apache.flink.runtime.entrypoint.FlinkParseException:
>> > >>>>>> Failed to
>> > >>>>>>       parse the command line arguments.
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
>> > >>>>>>
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
>> > >>>>>>
>> > >>>>>>       Caused by: org.apache.commons.cli.MissingOptionException:
>> > >>>>>> Missing
>> > >>>>>>       required option: j
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
>> > >>>>>>
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>> org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
>> > >>>>>>
>> > >>>>>>               ... 1 more
>> > >>>>>>
>> > >>>>>> The script should fail earlier if no jar is provided, with a
>> better
>> > >>>>>> error message.
>> > >>>>>> It is also undocumented, and the usage instructions don't appear
>> > >>>>>> correct.
>> > >>>>>>
>> > >>>>>> Passing a jar with the -j option leads to a
>> ClassNotFoundException:
>> > >>>>>> "./bin/standalone-job.sh start -j
>> examples/streaming/WordCount.jar"
>> > >>>>>>
>> > >>>>>>       2018-08-08 09:26:30,562 ERROR
>> > >>>>>>       org.apache.flink.runtime.entrypoint.ClusterEntrypoint
>> > -
>> > >>>>>>       Cluster initialization failed.
>> > >>>>>>       java.lang.reflect.UndeclaredThrowableException
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
>> > >>>>>>
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>> > >>>>>>
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
>> > >>>>>>
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
>> > >>>>>>
>> > >>>>>>       Caused by: org.apache.flink.util.FlinkException: Could not
>> > >>>>>> load the
>> > >>>>>>       provied entrypoint class.
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
>> > >>>>>>
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
>> > >>>>>>
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
>> > >>>>>>
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
>> > >>>>>>
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
>> > >>>>>>
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
>> > >>>>>>
>> > >>>>>>                at
>> > >>>>>> java.security.AccessController.doPrivileged(Native Method)
>> > >>>>>>                at
>> javax.security.auth.Subject.doAs(Subject.java:422)
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>> > >>>>>>
>> > >>>>>>                ... 3 more
>> > >>>>>>       Caused by: java.lang.ClassNotFoundException:
>> > >>>>>>       examples/streaming/WordCount.jar
>> > >>>>>>                at
>> > >>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>> > >>>>>>                at
>> > >>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>> > >>>>>>                at
>> > >>>>>>
>>  sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
>> > >>>>>>                at
>> > >>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>> > >>>>>>                at
>> > >>>>>>
>> > >>>>>>
>> >
>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
>> > >>>>>>
>> > >>>>>>                ... 11 more
>> > >>>>>>
>> > >>>>>> So this seems to not work at all, but maybe I'm using it wrong?
>> > >>>>>>
>> > >>>>>> (There's also a typo in "Could not load the provied entrypoint
>> class")
>> > >>>>>>
>> > >>>>>> On 08.08.2018 10:33, Piotr Nowojski wrote:
>> > >>>>>>> +1 from my side
>> > >>>>>>>
>> > >>>>>>> I’ve spent some time playing around with various examples
>> > (batching,
>> > >>>>>> streaming and SQL) on a 6-node EMR cluster with YARN deployment,
>> with
>> > >>>>>> different configuration options (number of task
>> > >>>>>> managers/memory/Flip6/credit-based flow control/metrics) and
>> > >>>>>> everything
>> > >>>>>> looks now fine (after fixing
>> > >>>>>> https://issues.apache.org/jira/browse/FLINK-9969 <
>> > >>>>>> https://issues.apache.org/jira/browse/FLINK-9969> ).
>> > >>>>>>> Piotrek
>> > >>>>>>>
>> > >>>>>>>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org>
>> > >>>>>>>> wrote:
>> > >>>>>>>>
>> > >>>>>>>> Hi everyone,
>> > >>>>>>>> Please review and vote on the release candidate #4 for the
>> version
>> > >>>>>> 1.6.0,
>> > >>>>>>>> as follows:
>> > >>>>>>>> [ ] +1, Approve the release
>> > >>>>>>>> [ ] -1, Do not approve the release (please provide specific
>> > >>>>>>>> comments)
>> > >>>>>>>>
>> > >>>>>>>>
>> > >>>>>>>> The complete staging area is available for your review, which
>> > >>>>>>>> includes:
>> > >>>>>>>> * JIRA release notes [1],
>> > >>>>>>>> * the official Apache source release and binary convenience
>> > >>>>>>>> releases to
>> > >>>>>> be
>> > >>>>>>>> deployed to dist.apache.org [2], which are signed with the key
>> > with
>> > >>>>>>>> fingerprint 1F302569A96CFFD5 [3],
>> > >>>>>>>> * all artifacts to be deployed to the Maven Central Repository
>> > [4],
>> > >>>>>>>> * source code tag "release-1.6.0-rc4" [5],
>> > >>>>>>>> * website pull request listing the new release and adding
>> > >>>>>>>> announcement
>> > >>>>>> blog
>> > >>>>>>>> post [6].
>> > >>>>>>>>
>> > >>>>>>>> Please use this document for coordinating testing efforts: [7]
>> > >>>>>>>>
>> > >>>>>>>> The vote will be shortened since we only added a minor fix on
>> top
>> > >>>>>>>> of the
>> > >>>>>> RC
>> > >>>>>>>> 3. It will close on Wednesday 6:30pm CET. It is adopted by
>> > majority
>> > >>>>>>>> approval, with at least 3 PMC affirmative votes.
>> > >>>>>>>>
>> > >>>>>>>> Thanks,
>> > >>>>>>>> Your friendly Release Manager
>> > >>>>>>>>
>> > >>>>>>>> [1]
>> > >>>>>>>>
>> > >>>>>>
>> >
>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
>> > >>>>>>
>> > >>>>>>>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
>> > >>>>>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
>> > >>>>>>>> [4]
>> > >>>>>>
>> > https://repository.apache.org/content/repositories/orgapacheflink-1178
>> > >>>>>>
>> > >>>>>>>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
>> > >>>>>>>> [6] https://github.com/apache/flink-web/pull/117
>> > >>>>>>>> [7]
>> > >>>>>>>>
>> > >>>>>>
>> >
>> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
>> > >>>>>>
>> > >>>>>>>> Pro-tip: you can create a settings.xml file with these
>> contents:
>> > >>>>>>>>
>> > >>>>>>>> <settings>
>> > >>>>>>>> <activeProfiles>
>> > >>>>>>>>     <activeProfile>flink-1.6.0</activeProfile>
>> > >>>>>>>> </activeProfiles>
>> > >>>>>>>> <profiles>
>> > >>>>>>>>     <profile>
>> > >>>>>>>>       <id>flink-1.6.0</id>
>> > >>>>>>>>       <repositories>
>> > >>>>>>>>         <repository>
>> > >>>>>>>>           <id>flink-1.6.0</id>
>> > >>>>>>>>           <url>
>> > >>>>>>>>
>> > >>>>>>>>
>> > https://repository.apache.org/content/repositories/orgapacheflink-1178/
>> > >>>>>>>>
>> > >>>>>>>>           </url>
>> > >>>>>>>>         </repository>
>> > >>>>>>>>         <repository>
>> > >>>>>>>>           <id>archetype</id>
>> > >>>>>>>>           <url>
>> > >>>>>>>>
>> > >>>>>>>>
>> > https://repository.apache.org/content/repositories/orgapacheflink-1178/
>> > >>>>>>>>
>> > >>>>>>>>           </url>
>> > >>>>>>>>         </repository>
>> > >>>>>>>>       </repositories>
>> > >>>>>>>>     </profile>
>> > >>>>>>>> </profiles>
>> > >>>>>>>> </settings>
>> > >>>>>>>>
>> > >>>>>>>> And reference that in your maven commands via --settings
>> > >>>>>>>> path/to/settings.xml. This is useful for creating a quickstart
>> > >>>>>>>> based on
>> > >>>>>> the
>> > >>>>>>>> staged release and for building against the staged jars.
>> > >>
>> >
>> >
>>
>

Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Till Rohrmann <tr...@apache.org>.
+1

- Checked checksums and signatures
- Verified that no unwanted binaries are contained in source release
- Checked LICENSE and NOTICE file
- Checked that all newly added dependencies have a compatible license
- Checked that a local cluster can be started and stopped without
exceptions in the log
- Verified that SBT quickstarts are up to date
- Verified that Java quickstarts work with IntelliJ
- Verified that all Jepsen tests pass
- Verified that e2e tests modulo test_sql_client.sh (see
https://issues.apache.org/jira/browse/FLINK-10107) pass

Cheers,
Till

On Thu, Aug 9, 2018 at 8:18 AM vino yang <ya...@gmail.com> wrote:

> +1
>
> - checkout 1.6 source code and successfully run `mvn clean package
> -DskipTests`
> - searched '1.5' and '1.5.2' in all modules pom file and successfully
> verified flink version was changed
> - successfully run table and sql test locally
>
> Thanks, vino.
>
>
> Timo Walther <tw...@apache.org> 于2018年8月9日周四 上午3:24写道:
>
> > +1
> >
> > - successfully run `mvn clean verify` locally
> > - successfully run end-to-end tests locally (except for SQL Client
> > end-to-end test)
> >
> > Found a bug in the class loading of SQL JAR files. This is not a blocker
> > but a bug that we should fix soon. As an easy workaround, users should not
> > use different Kafka versions as SQL Client dependencies.
> >
> > Regards,
> > Timo
> >
> > Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
> > > +1
> > >
> > > - verified compilation, tests
> > > - verified checksum and gpg files
> > > - verified sbt templates (g8, quickstart) - run assemblies on local
> > cluster
> > >
> > > - I could not execute the nightly-tests.sh though. The tests that were
> > > failing most often are:
> > >      - test_streaming_file_sink.sh
> > >      - test_streaming_elasticsearch.sh
> > >
> > > Those are connectors though and it might be only test flakiness, so I
> > > think it should not block the release.
> > >
> > > On 08/08/18 16:36, Chesnay Schepler wrote:
> > >> I did not use the tools/list_deps.py script as I wasn't aware that it
> > >> existed.
> > >>
> > >> Even if I were I wouldn't have used it and in fact would advocate for
> > >> removing it.
> > >> It manually parses and constructs dependency information which is
> > >> utterly unnecessary as maven already provides this functionality, with
> > >> the added bonus of also accounting for dependencyManagement and
> > >> transitive dependencies which we obviously have to take into account.
> > >>
> > >> I used this one-liner instead:
> > >> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" \
> > >>   -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
> > >>
> > >> which I have documented here:
> > >> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
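
To see what that pipeline does, here is a sketch run on fabricated `mvn dependency:list`-style lines (the artifact coordinates are made up for illustration):

```shell
# Feed fake `mvn dependency:list`-style lines through the same pipeline:
# keep only coordinate lines, strip the "[INFO]" prefix and the trailing
# scope (:compile, :provided, ...), then deduplicate.
printf '%s\n' \
  '[INFO] The following files have been resolved:' \
  '[INFO]    org.apache.flink:flink-core:jar:1.6.0:compile' \
  '[INFO]    org.apache.flink:flink-core:jar:1.6.0:compile' \
  '[INFO]    io.prometheus:simpleclient_pushgateway:jar:0.3.0:provided' |
  grep ":.*:.*:.*" |
  grep -v -e "Finished at" -e "Some problems" |
  cut -d] -f2- |
  sed 's/:[a-z]*$//g' |
  sort -u
# Prints the two unique coordinates, without their scope suffixes.
```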
> > >>
> > >> On 08.08.2018 15:06, Aljoscha Krettek wrote:
> > >>> +1
> > >>>
> > >>> - verified checksum and gpg files
> > >>> - verified LICENSE and NOTICE: NOTICE didn't change from 1.5, LICENSE
> > >>> had one unnecessary part removed
> > >>>
> > >>> Side comment: I'm not sure whether the "Verify that the LICENSE and
> > >>> NOTICE file is correct for the binary and source releases" part is
> > >>> valid anymore because we only have one LICENSE and NOTICE file. Also
> > >>> "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
> > >>> to the binary distribution and mention all of Flink's Maven
> > >>> dependencies as well" can be dropped because we don't have them
> > anymore.
> > >>>
> > >>> I came to the same conclusion on dependencies. I used
> > >>> tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
> > >>> probably what Chesnay also did ... :-)
> > >>>
> > >>>> On 8. Aug 2018, at 14:43, Chesnay Schepler <ch...@apache.org>
> > wrote:
> > >>>>
> > >>>> +1
> > >>>>
> > >>>> - verified source release contains no binaries
> > >>>> - verified correct versions in source release
> > >>>> - verified compilation, tests and E2E-tests pass (on travis)
> > >>>> - verified checksum and gpg files
> > >>>>
> > >>>> New dependencies (excluding dependencies where we simply depend on a
> > >>>> different version now):
> > >>>>      Apache licensed:
> > >>>>          io.confluent:common-utils:jar:3.3.1
> > >>>>          io.confluent:kafka-schema-registry-client:jar:3.3.1
> > >>>>          io.prometheus:simpleclient_pushgateway:jar:0.3.0
> > >>>>          various Apache NiFi dependencies
> > >>>>          various Apache Parquet dependencies
> > >>>>          various ElasticSearch dependencies
> > >>>>      CDDL:
> > >>>>          javax.ws.rs:javax.ws.rs-api:jar:2.1
> > >>>>      Bouncycastle (MIT-like):
> > >>>>          org.bouncycastle:bcpkix-jdk15on:jar:1.59
> > >>>>          org.bouncycastle:bcprov-jdk15on:jar:1.59
> > >>>>      MIT:
> > >>>>          org.projectlombok:lombok:jar:1.16.20
> > >>>>
> > >>>> On 08.08.2018 13:28, Till Rohrmann wrote:
> > >>>>> Thanks for reporting these problems Chesnay. The usage string in
> > >>>>> `standalone-job.sh` is outdated and should be updated. The same
> > >>>>> applies to
> > >>>>> the typo.
> > >>>>>
> > >>>>> When calling `standalone-job.sh start --job-classname foobar.Job`
> > >>>>> please
> > >>>>> make sure that the user code jar is contained in the classpath
> (e.g.
> > >>>>> putting the jar in the lib directory). Documenting this behaviour
> > >>>>> is part
> > >>>>> of the pending issue FLINK-10001.
> > >>>>>
> > >>>>> We should fix all of these issues. They are, however, no release
> > >>>>> blockers.
> > >>>>>
> > >>>>> Cheers,
> > >>>>> Till
> > >>>>>
> > >>>>> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler
> > >>>>> <ch...@apache.org> wrote:
> > >>>>>
> > >>>>>> I found some issues with the standalone-job.sh script.
> > >>>>>>
> > >>>>>> I ran "./bin/standalone-job.sh start" as described by the usage
> > >>>>>> string.
> > >>>>>>
> > >>>>>>       2018-08-08 09:22:34,385 ERROR
> > >>>>>>       org.apache.flink.runtime.entrypoint.ClusterEntrypoint
> > -
> > >>>>>>       Could not parse command line arguments [--configDir,
> > >>>>>>       /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
> > >>>>>>       org.apache.flink.runtime.entrypoint.FlinkParseException:
> > >>>>>> Failed to
> > >>>>>>       parse the command line arguments.
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
> > >>>>>>
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
> > >>>>>>
> > >>>>>>       Caused by: org.apache.commons.cli.MissingOptionException:
> > >>>>>> Missing
> > >>>>>>       required option: j
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
> > >>>>>>
> > >>>>>>                at
> > >>>>>>
> > >>>>>> org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
> > >>>>>>                at
> > >>>>>>
> > >>>>>> org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
> > >>>>>>
> > >>>>>>               ... 1 more
> > >>>>>>
> > >>>>>> The script should fail earlier if no jar is provided, with a
> better
> > >>>>>> error message.
> > >>>>>> It is also undocumented, and the usage instructions don't appear
> > >>>>>> correct.
> > >>>>>>
> > >>>>>> Passing a jar with the -j option leads to a
> ClassNotFoundException:
> > >>>>>> "./bin/standalone-job.sh start -j
> examples/streaming/WordCount.jar"
> > >>>>>>
> > >>>>>>       2018-08-08 09:26:30,562 ERROR
> > >>>>>>       org.apache.flink.runtime.entrypoint.ClusterEntrypoint
> > -
> > >>>>>>       Cluster initialization failed.
> > >>>>>>       java.lang.reflect.UndeclaredThrowableException
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
> > >>>>>>
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> > >>>>>>
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
> > >>>>>>
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
> > >>>>>>
> > >>>>>>       Caused by: org.apache.flink.util.FlinkException: Could not
> > >>>>>> load the
> > >>>>>>       provied entrypoint class.
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
> > >>>>>>
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
> > >>>>>>
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
> > >>>>>>
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
> > >>>>>>
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
> > >>>>>>
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
> > >>>>>>
> > >>>>>>                at
> > >>>>>> java.security.AccessController.doPrivileged(Native Method)
> > >>>>>>                at
> javax.security.auth.Subject.doAs(Subject.java:422)
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
> > >>>>>>
> > >>>>>>                ... 3 more
> > >>>>>>       Caused by: java.lang.ClassNotFoundException:
> > >>>>>>       examples/streaming/WordCount.jar
> > >>>>>>                at
> > >>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> > >>>>>>                at
> > >>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> > >>>>>>                at
> > >>>>>>
>  sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
> > >>>>>>                at
> > >>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> > >>>>>>                at
> > >>>>>>
> > >>>>>>
> >
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
> > >>>>>>
> > >>>>>>                ... 11 more
> > >>>>>>
> > >>>>>> So this seems to not work at all, but maybe I'm using it wrong?
> > >>>>>>
> > >>>>>> (There's also a typo in "Could not load the provied entrypoint
> class")
> > >>>>>>
> > >>>>>> On 08.08.2018 10:33, Piotr Nowojski wrote:
> > >>>>>>> +1 from my side
> > >>>>>>>
> > >>>>>>> I’ve spent some time playing around with various examples
> > (batching,
> > >>>>>> streaming and SQL) on a 6-node EMR cluster with YARN deployment,
> with
> > >>>>>> different configuration options (number of task
> > >>>>>> managers/memory/Flip6/credit-based flow control/metrics) and
> > >>>>>> everything
> > >>>>>> looks now fine (after fixing
> > >>>>>> https://issues.apache.org/jira/browse/FLINK-9969 <
> > >>>>>> https://issues.apache.org/jira/browse/FLINK-9969> ).
> > >>>>>>> Piotrek
> > >>>>>>>
> > >>>>>>>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org>
> > >>>>>>>> wrote:
> > >>>>>>>>
> > >>>>>>>> Hi everyone,
> > >>>>>>>> Please review and vote on the release candidate #4 for the
> version
> > >>>>>> 1.6.0,
> > >>>>>>>> as follows:
> > >>>>>>>> [ ] +1, Approve the release
> > >>>>>>>> [ ] -1, Do not approve the release (please provide specific
> > >>>>>>>> comments)
> > >>>>>>>>
> > >>>>>>>>
> > >>>>>>>> The complete staging area is available for your review, which
> > >>>>>>>> includes:
> > >>>>>>>> * JIRA release notes [1],
> > >>>>>>>> * the official Apache source release and binary convenience
> > >>>>>>>> releases to
> > >>>>>> be
> > >>>>>>>> deployed to dist.apache.org [2], which are signed with the key
> > with
> > >>>>>>>> fingerprint 1F302569A96CFFD5 [3],
> > >>>>>>>> * all artifacts to be deployed to the Maven Central Repository
> > [4],
> > >>>>>>>> * source code tag "release-1.6.0-rc4" [5],
> > >>>>>>>> * website pull request listing the new release and adding
> > >>>>>>>> announcement
> > >>>>>> blog
> > >>>>>>>> post [6].
> > >>>>>>>>
> > >>>>>>>> Please use this document for coordinating testing efforts: [7]
> > >>>>>>>>
> > >>>>>>>> The vote will be shortened since we only added a minor fix on top
> > >>>>>>>> of the
> > >>>>>> RC
> > >>>>>>>> 3. It will close on Wednesday 6:30pm CET. It is adopted by
> > majority
> > >>>>>>>> approval, with at least 3 PMC affirmative votes.
> > >>>>>>>>
> > >>>>>>>> Thanks,
> > >>>>>>>> Your friendly Release Manager
> > >>>>>>>>
> > >>>>>>>> [1]
> > >>>>>>>>
> > >>>>>>
> >
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
> > >>>>>>
> > >>>>>>>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
> > >>>>>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
> > >>>>>>>> [4]
> > >>>>>>
> > https://repository.apache.org/content/repositories/orgapacheflink-1178
> > >>>>>>
> > >>>>>>>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
> > >>>>>>>> [6] https://github.com/apache/flink-web/pull/117
> > >>>>>>>> [7]
> > >>>>>>>>
> > >>>>>>
> >
> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
> > >>>>>>
> > >>>>>>>> Pro-tip: you can create a settings.xml file with these contents:
> > >>>>>>>>
> > >>>>>>>> <settings>
> > >>>>>>>> <activeProfiles>
> > >>>>>>>>     <activeProfile>flink-1.6.0</activeProfile>
> > >>>>>>>> </activeProfiles>
> > >>>>>>>> <profiles>
> > >>>>>>>>     <profile>
> > >>>>>>>>       <id>flink-1.6.0</id>
> > >>>>>>>>       <repositories>
> > >>>>>>>>         <repository>
> > >>>>>>>>           <id>flink-1.6.0</id>
> > >>>>>>>>           <url>
> > >>>>>>>>
> > >>>>>>>>
> > https://repository.apache.org/content/repositories/orgapacheflink-1178/
> > >>>>>>>>
> > >>>>>>>>           </url>
> > >>>>>>>>         </repository>
> > >>>>>>>>         <repository>
> > >>>>>>>>           <id>archetype</id>
> > >>>>>>>>           <url>
> > >>>>>>>>
> > >>>>>>>>
> > https://repository.apache.org/content/repositories/orgapacheflink-1178/
> > >>>>>>>>
> > >>>>>>>>           </url>
> > >>>>>>>>         </repository>
> > >>>>>>>>       </repositories>
> > >>>>>>>>     </profile>
> > >>>>>>>> </profiles>
> > >>>>>>>> </settings>
> > >>>>>>>>
> > >>>>>>>> And reference that in your maven commands via --settings
> > >>>>>>>> path/to/settings.xml. This is useful for creating a quickstart
> > >>>>>>>> based on
> > >>>>>> the
> > >>>>>>>> staged release and for building against the staged jars.
> > >>
> >
> >
>

Re: [VOTE] Release 1.6.0, release candidate #4

Posted by vino yang <ya...@gmail.com>.
+1

- checked out the 1.6 source code and successfully ran `mvn clean package
-DskipTests`
- searched for '1.5' and '1.5.2' in all modules' pom files and successfully
verified the flink version was changed
- successfully ran table and SQL tests locally

Thanks, vino.


Timo Walther <tw...@apache.org> 于2018年8月9日周四 上午3:24写道:

> +1
>
> - successfully run `mvn clean verify` locally
> - successfully run end-to-end tests locally (except for SQL Client
> end-to-end test)
>
> Found a bug in the class loading of SQL JAR files. This is not a blocker
> but a bug that we should fix soon. As an easy workaround, users should not
> use different Kafka versions as SQL Client dependencies.
>
> Regards,
> Timo
>
> Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
> > +1
> >
> > - verified compilation, tests
> > - verified checksum and gpg files
> > - verified sbt templates (g8, quickstart) - run assemblies on local
> cluster
> >
> > - I could not execute the nightly-tests.sh though. The tests that were
> > failing most often are:
> >      - test_streaming_file_sink.sh
> >      - test_streaming_elasticsearch.sh
> >
> > Those are connectors though and it might be only test flakiness, so I
> > think it should not block the release.
> >
> > On 08/08/18 16:36, Chesnay Schepler wrote:
> >> I did not use the tools/list_deps.py script as I wasn't aware that it
> >> existed.
> >>
> >> Even if I were I wouldn't have used it and in fact would advocate for
> >> removing it.
> >> It manually parses and constructs dependency information which is
> >> utterly unnecessary as maven already provides this functionality, with
> >> the added bonus of also accounting for dependencyManagement and
> >> transitive dependencies which we obviously have to take into account.
> >>
> >> I used this one-liner instead:
> >> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" \
> >>   -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
> >>
> >> which I have documented here:
> >> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
> >>
> >> On 08.08.2018 15:06, Aljoscha Krettek wrote:
> >>> +1
> >>>
> >>> - verified checksum and gpg files
> >>> - verified LICENSE and NOTICE: NOTICE didn't change from 1.5, LICENSE
> >>> had one unnecessary part removed
> >>>
> >>> Side comment: I'm not sure whether the "Verify that the LICENSE and
> >>> NOTICE file is correct for the binary and source releases" part is
> >>> valid anymore because we only have one LICENSE and NOTICE file. Also
> >>> "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
> >>> to the binary distribution and mention all of Flink's Maven
> >>> dependencies as well" can be dropped because we don't have them
> anymore.
> >>>
> >>> I came to the same conclusion on dependencies. I used
> >>> tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
> >>> probably what Chesnay also did ... :-)
> >>>
> >>>> On 8. Aug 2018, at 14:43, Chesnay Schepler <ch...@apache.org>
> wrote:
> >>>>
> >>>> +1
> >>>>
> >>>> - verified source release contains no binaries
> >>>> - verified correct versions in source release
> >>>> - verified compilation, tests and E2E-tests pass (on travis)
> >>>> - verified checksum and gpg files
> >>>>
> >>>> New dependencies (excluding dependencies where we simply depend on a
> >>>> different version now):
> >>>>      Apache licensed:
> >>>>          io.confluent:common-utils:jar:3.3.1
> >>>>          io.confluent:kafka-schema-registry-client:jar:3.3.1
> >>>>          io.prometheus:simpleclient_pushgateway:jar:0.3.0
> >>>>          various Apache NiFi dependencies
> >>>>          various Apache Parquet dependencies
> >>>>          various ElasticSearch dependencies
> >>>>      CDDL:
> >>>>          javax.ws.rs:javax.ws.rs-api:jar:2.1
> >>>>      Bouncycastle (MIT-like):
> >>>>          org.bouncycastle:bcpkix-jdk15on:jar:1.59
> >>>>          org.bouncycastle:bcprov-jdk15on:jar:1.59
> >>>>      MIT:
> >>>>          org.projectlombok:lombok:jar:1.16.20
> >>>>
> >>>> On 08.08.2018 13:28, Till Rohrmann wrote:
> >>>>> Thanks for reporting these problems Chesnay. The usage string in
> >>>>> `standalone-job.sh` is outdated and should be updated. The same
> >>>>> applies to
> >>>>> the typo.
> >>>>>
> >>>>> When calling `standalone-job.sh start --job-classname foobar.Job`
> >>>>> please
> >>>>> make sure that the user code jar is contained in the classpath (e.g.
> >>>>> putting the jar in the lib directory). Documenting this behaviour
> >>>>> is part
> >>>>> of the pending issue FLINK-10001.
> >>>>>
> >>>>> We should fix all of these issues. They are, however, no release
> >>>>> blockers.
> >>>>>
> >>>>> Cheers,
> >>>>> Till
> >>>>>
> >>>>> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler
> >>>>> <ch...@apache.org> wrote:
> >>>>>
> >>>>>> I found some issues with the standalone-job.sh script.
> >>>>>>
> >>>>>> I ran "./bin/standalone-job.sh start" as described by the usage
> >>>>>> string.
> >>>>>>
> >>>>>>       2018-08-08 09:22:34,385 ERROR
> >>>>>>       org.apache.flink.runtime.entrypoint.ClusterEntrypoint
> -
> >>>>>>       Could not parse command line arguments [--configDir,
> >>>>>>       /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
> >>>>>>       org.apache.flink.runtime.entrypoint.FlinkParseException:
> >>>>>> Failed to
> >>>>>>       parse the command line arguments.
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
> >>>>>>
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
> >>>>>>
> >>>>>>       Caused by: org.apache.commons.cli.MissingOptionException:
> >>>>>> Missing
> >>>>>>       required option: j
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
> >>>>>>
> >>>>>>                at
> >>>>>>
> >>>>>> org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
> >>>>>>                at
> >>>>>>
> >>>>>> org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
> >>>>>>
> >>>>>>               ... 1 more
> >>>>>>
> >>>>>> The script should fail earlier if no jar is provided, with a better
> >>>>>> error message.
> >>>>>> It is also undocumented, and the usage instructions don't appear
> >>>>>> correct.
> >>>>>>
> >>>>>> Passing a jar with the -j option leads to a ClassNotFoundException:
> >>>>>> "./bin/standalone-job.sh start -j examples/streaming/WordCount.jar"
> >>>>>>
> >>>>>>       2018-08-08 09:26:30,562 ERROR
> >>>>>>       org.apache.flink.runtime.entrypoint.ClusterEntrypoint
> -
> >>>>>>       Cluster initialization failed.
> >>>>>>       java.lang.reflect.UndeclaredThrowableException
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
> >>>>>>
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> >>>>>>
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
> >>>>>>
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
> >>>>>>
> >>>>>>       Caused by: org.apache.flink.util.FlinkException: Could not
> >>>>>> load the
> >>>>>>       provied entrypoint class.
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
> >>>>>>
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
> >>>>>>
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
> >>>>>>
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
> >>>>>>
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
> >>>>>>
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
> >>>>>>
> >>>>>>                at
> >>>>>> java.security.AccessController.doPrivileged(Native Method)
> >>>>>>                at javax.security.auth.Subject.doAs(Subject.java:422)
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
> >>>>>>
> >>>>>>                ... 3 more
> >>>>>>       Caused by: java.lang.ClassNotFoundException:
> >>>>>>       examples/streaming/WordCount.jar
> >>>>>>                at
> >>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> >>>>>>                at
> >>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> >>>>>>                at
> >>>>>>       sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
> >>>>>>                at
> >>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> >>>>>>                at
> >>>>>>
> >>>>>>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
> >>>>>>
> >>>>>>                ... 11 more
> >>>>>>
> >>>>>> So this seems to not work at all, but maybe I'm using it wrong?
> >>>>>>
> >>>>>> (There's also a typo in "Could not load the provied entrypoint class")
> >>>>>>
> >>>>>> On 08.08.2018 10:33, Piotr Nowojski wrote:
> >>>>>>> +1 from my side
> >>>>>>>
> >>>>>>> I’ve spent some time playing around with various examples
> (batching,
> >>>>>> streaming and SQL) on a 6-node EMR cluster with YARN deployment, with
> >>>>>> different configuration options (number of task
> >>>>>> managers/memory/Flip6/credit-based flow control/metrics) and
> >>>>>> everything
> >>>>>> looks now fine (after fixing
> >>>>>> https://issues.apache.org/jira/browse/FLINK-9969 <
> >>>>>> https://issues.apache.org/jira/browse/FLINK-9969> ).
> >>>>>>> Piotrek
> >>>>>>>
> >>>>>>>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org>
> >>>>>>>> wrote:
> >>>>>>>>
> >>>>>>>> Hi everyone,
> >>>>>>>> Please review and vote on the release candidate #4 for the version
> >>>>>> 1.6.0,
> >>>>>>>> as follows:
> >>>>>>>> [ ] +1, Approve the release
> >>>>>>>> [ ] -1, Do not approve the release (please provide specific
> >>>>>>>> comments)
> >>>>>>>>
> >>>>>>>>
> >>>>>>>> The complete staging area is available for your review, which
> >>>>>>>> includes:
> >>>>>>>> * JIRA release notes [1],
> >>>>>>>> * the official Apache source release and binary convenience
> >>>>>>>> releases to
> >>>>>> be
> >>>>>>>> deployed to dist.apache.org [2], which are signed with the key
> with
> >>>>>>>> fingerprint 1F302569A96CFFD5 [3],
> >>>>>>>> * all artifacts to be deployed to the Maven Central Repository
> [4],
> >>>>>>>> * source code tag "release-1.6.0-rc4" [5],
> >>>>>>>> * website pull request listing the new release and adding
> >>>>>>>> announcement
> >>>>>> blog
> >>>>>>>> post [6].
> >>>>>>>>
> >>>>>>>> Please use this document for coordinating testing efforts: [7]
> >>>>>>>>
> >>>>>>>> The vote will be shortened since we only added a minor fix on top
> >>>>>>>> of the
> >>>>>> RC
> >>>>>>>> 3. It will close on Wednesday 6:30pm CET. It is adopted by
> majority
> >>>>>>>> approval, with at least 3 PMC affirmative votes.
> >>>>>>>>
> >>>>>>>> Thanks,
> >>>>>>>> Your friendly Release Manager
> >>>>>>>>
> >>>>>>>> [1]
> >>>>>>>>
> >>>>>>
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
> >>>>>>
> >>>>>>>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
> >>>>>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
> >>>>>>>> [4]
> >>>>>>
> https://repository.apache.org/content/repositories/orgapacheflink-1178
> >>>>>>
> >>>>>>>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
> >>>>>>>> [6] https://github.com/apache/flink-web/pull/117
> >>>>>>>> [7]
> >>>>>>>>
> >>>>>>
> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
> >>>>>>
> >>>>>>>> Pro-tip: you can create a settings.xml file with these contents:
> >>>>>>>>
> >>>>>>>> <settings>
> >>>>>>>> <activeProfiles>
> >>>>>>>>     <activeProfile>flink-1.6.0</activeProfile>
> >>>>>>>> </activeProfiles>
> >>>>>>>> <profiles>
> >>>>>>>>     <profile>
> >>>>>>>>       <id>flink-1.6.0</id>
> >>>>>>>>       <repositories>
> >>>>>>>>         <repository>
> >>>>>>>>           <id>flink-1.6.0</id>
> >>>>>>>>           <url>
> >>>>>>>>
> >>>>>>>>
> https://repository.apache.org/content/repositories/orgapacheflink-1178/
> >>>>>>>>
> >>>>>>>>           </url>
> >>>>>>>>         </repository>
> >>>>>>>>         <repository>
> >>>>>>>>           <id>archetype</id>
> >>>>>>>>           <url>
> >>>>>>>>
> >>>>>>>>
> https://repository.apache.org/content/repositories/orgapacheflink-1178/
> >>>>>>>>
> >>>>>>>>           </url>
> >>>>>>>>         </repository>
> >>>>>>>>       </repositories>
> >>>>>>>>     </profile>
> >>>>>>>> </profiles>
> >>>>>>>> </settings>
> >>>>>>>>
> >>>>>>>> And reference that in your maven commands via --settings
> >>>>>>>> path/to/settings.xml. This is useful for creating a quickstart
> >>>>>>>> based on
> >>>>>> the
> >>>>>>>> staged release and for building against the staged jars.
> >>
>
>

Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Timo Walther <tw...@apache.org>.
+1

- successfully run `mvn clean verify` locally
- successfully run end-to-end tests locally (except for SQL Client 
end-to-end test)

Found a bug in the class loading of SQL JAR files. This is not a blocker 
but a bug that we should fix soon. As an easy workaround, users should not
use different Kafka versions as SQL Client dependencies.

Regards,
Timo

Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
> +1
>
> - verified compilation, tests
> - verified checksum and gpg files
> - verified sbt templates (g8, quickstart) - run assemblies on local cluster
>
> - I could not execute the nightly-tests.sh though. The tests that were
> failing most often are:
>      - test_streaming_file_sink.sh
>      - test_streaming_elasticsearch.sh
>
> Those are connectors though and it might be only test flakiness, so I
> think it should not block the release.
>
> On 08/08/18 16:36, Chesnay Schepler wrote:
>> I did not use the tools/list_deps.py script as I wasn't aware that it
>> existed.
>>
>> Even if I were I wouldn't have used it and in fact would advocate for
>> removing it.
>> It manually parses and constructs dependency information which is
>> utterly unnecessary as maven already provides this functionality, with
>> the added bonus of also accounting for dependencyManagement and
>> transitive dependencies which we obviously have to take into account.
>>
>> I used this one-liner instead:
>> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" \
>>   -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
>>
>> which I have documented here:
>> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
>>
>> On 08.08.2018 15:06, Aljoscha Krettek wrote:
>>> +1
>>>
>>> - verified checksum and gpg files
>>> - verified LICENSE and NOTICE: NOTICE didn't change from 1.5, LICENSE
>>> had one unnecessary part removed
>>>
>>> Side comment: I'm not sure whether the "Verify that the LICENSE and
>>> NOTICE file is correct for the binary and source releases" part is
>>> valid anymore because we only have one LICENSE and NOTICE file. Also
>>> "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
>>> to the binary distribution and mention all of Flink's Maven
>>> dependencies as well" can be dropped because we don't have them anymore.
>>>
>>> I came to the same conclusion on dependencies. I used
>>> tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
>>> probably what Chesnay also did ... :-)
>>>
>>>> On 8. Aug 2018, at 14:43, Chesnay Schepler <ch...@apache.org> wrote:
>>>>
>>>> +1
>>>>
>>>> - verified source release contains no binaries
>>>> - verified correct versions in source release
>>>> - verified compilation, tests and E2E-tests pass (on travis)
>>>> - verified checksum and gpg files
>>>>
>>>> New dependencies (excluding dependencies where we simply depend on a
>>>> different version now):
>>>>      Apache licensed:
>>>>          io.confluent:common-utils:jar:3.3.1
>>>>          io.confluent:kafka-schema-registry-client:jar:3.3.1
>>>>          io.prometheus:simpleclient_pushgateway:jar:0.3.0
>>>>          various Apache NiFi dependencies
>>>>          various Apache Parquet dependencies
>>>>          various ElasticSearch dependencies
>>>>      CDDL:
>>>>          javax.ws.rs:javax.ws.rs-api:jar:2.1
>>>>      Bouncycastle (MIT-like):
>>>>          org.bouncycastle:bcpkix-jdk15on:jar:1.59
>>>>          org.bouncycastle:bcprov-jdk15on:jar:1.59
>>>>      MIT:
>>>>          org.projectlombok:lombok:jar:1.16.20
>>>>
>>>> On 08.08.2018 13:28, Till Rohrmann wrote:
>>>>> Thanks for reporting these problems, Chesnay. The usage string in
>>>>> `standalone-job.sh` is outdated and should be updated. The same
>>>>> applies to
>>>>> the typo.
>>>>>
>>>>> When calling `standalone-job.sh start --job-classname foobar.Job`
>>>>> please
>>>>> make sure that the user code jar is contained in the classpath (e.g.
>>>>> putting the jar in the lib directory). Documenting this behaviour
>>>>> is part
>>>>> of the pending issue FLINK-10001.
>>>>>
>>>>> We should fix all of these issues. They are, however, not release
>>>>> blockers.
>>>>>
>>>>> Cheers,
>>>>> Till
>>>>>
>>>>> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler
>>>>> <ch...@apache.org> wrote:
>>>>>
>>>>>> I found some issues with the standalone-job.sh script.
>>>>>>
>>>>>> I ran "./bin/standalone-job.sh start" as described by the usage
>>>>>> string.
>>>>>>
>>>>>>       2018-08-08 09:22:34,385 ERROR
>>>>>>       org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>>>>>       Could not parse command line arguments [--configDir,
>>>>>>       /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
>>>>>>       org.apache.flink.runtime.entrypoint.FlinkParseException:
>>>>>> Failed to
>>>>>>       parse the command line arguments.
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
>>>>>>
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
>>>>>>
>>>>>>       Caused by: org.apache.commons.cli.MissingOptionException:
>>>>>> Missing
>>>>>>       required option: j
>>>>>>                at
>>>>>>
>>>>>> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
>>>>>>
>>>>>>                at
>>>>>>      
>>>>>> org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
>>>>>>                at
>>>>>>      
>>>>>> org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
>>>>>>
>>>>>>               ... 1 more
>>>>>>
>>>>>> The script should fail earlier if no jar is provided, with a better
>>>>>> error message.
>>>>>> It is also undocumented, and the usage instructions don't appear
>>>>>> correct.
>>>>>>
>>>>>> Passing a jar with the -j option leads to a ClassNotFoundException:
>>>>>> "./bin/standalone-job.sh start -j examples/streaming/WordCount.jar"
>>>>>>
>>>>>>       2018-08-08 09:26:30,562 ERROR
>>>>>>       org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>>>>>       Cluster initialization failed.
>>>>>>       java.lang.reflect.UndeclaredThrowableException
>>>>>>                at
>>>>>>
>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
>>>>>>
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>>>>
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
>>>>>>
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
>>>>>>
>>>>>>       Caused by: org.apache.flink.util.FlinkException: Could not
>>>>>> load the
>>>>>>       provied entrypoint class.
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
>>>>>>
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
>>>>>>
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
>>>>>>
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
>>>>>>
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
>>>>>>
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
>>>>>>
>>>>>>                at
>>>>>> java.security.AccessController.doPrivileged(Native Method)
>>>>>>                at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>>>                at
>>>>>>
>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>>>>>
>>>>>>                ... 3 more
>>>>>>       Caused by: java.lang.ClassNotFoundException:
>>>>>>       examples/streaming/WordCount.jar
>>>>>>                at
>>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>>>>                at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>>>                at
>>>>>>       sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
>>>>>>                at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>>>                at
>>>>>>
>>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
>>>>>>
>>>>>>                ... 11 more
>>>>>>
>>>>>> So this seems to not work at all, but maybe I'm using it wrong?
>>>>>>
>>>>>> (There's also a typo in "Could not load the provied entrypoint class")
>>>>>>
>>>>>> On 08.08.2018 10:33, Piotr Nowojski wrote:
>>>>>>> +1 from my side
>>>>>>>
>>>>>>> I’ve spent some time playing around with various examples (batching,
>>>>>> streaming and SQL) on a 6-node EMR cluster with YARN deployment, with
>>>>>> different configuration options (number of task
>>>>>> managers/memory/Flip6/credit-based flow control/metrics) and everything
>>>>>> looks now fine (after fixing
>>>>>> https://issues.apache.org/jira/browse/FLINK-9969 <
>>>>>> https://issues.apache.org/jira/browse/FLINK-9969> ).
>>>>>>> Piotrek
>>>>>>>
>>>>>>>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Hi everyone,
>>>>>>>> Please review and vote on the release candidate #4 for the version
>>>>>> 1.6.0,
>>>>>>>> as follows:
>>>>>>>> [ ] +1, Approve the release
>>>>>>>> [ ] -1, Do not approve the release (please provide specific
>>>>>>>> comments)
>>>>>>>>
>>>>>>>>
>>>>>>>> The complete staging area is available for your review, which
>>>>>>>> includes:
>>>>>>>> * JIRA release notes [1],
>>>>>>>> * the official Apache source release and binary convenience
>>>>>>>> releases to
>>>>>> be
>>>>>>>> deployed to dist.apache.org [2], which are signed with the key with
>>>>>>>> fingerprint 1F302569A96CFFD5 [3],
>>>>>>>> * all artifacts to be deployed to the Maven Central Repository [4],
>>>>>>>> * source code tag "release-1.6.0-rc4" [5],
>>>>>>>> * website pull request listing the new release and adding
>>>>>>>> announcement
>>>>>> blog
>>>>>>>> post [6].
>>>>>>>>
>>>>>>>> Please use this document for coordinating testing efforts: [7]
>>>>>>>>
>>>>>>>> The vote will be shortened since we only added a minor fix on top
>>>>>>>> of the
>>>>>> RC
>>>>>>>> 3. It will close on Wednesday 6:30pm CET. It is adopted by majority
>>>>>>>> approval, with at least 3 PMC affirmative votes.
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Your friendly Release Manager
>>>>>>>>
>>>>>>>> [1]
>>>>>>>>
>>>>>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
>>>>>>
>>>>>>>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
>>>>>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
>>>>>>>> [4]
>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178
>>>>>>
>>>>>>>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
>>>>>>>> [6] https://github.com/apache/flink-web/pull/117
>>>>>>>> [7]
>>>>>>>>
>>>>>> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
>>>>>>
>>>>>>>> Pro-tip: you can create a settings.xml file with these contents:
>>>>>>>>
>>>>>>>> <settings>
>>>>>>>> <activeProfiles>
>>>>>>>>     <activeProfile>flink-1.6.0</activeProfile>
>>>>>>>> </activeProfiles>
>>>>>>>> <profiles>
>>>>>>>>     <profile>
>>>>>>>>       <id>flink-1.6.0</id>
>>>>>>>>       <repositories>
>>>>>>>>         <repository>
>>>>>>>>           <id>flink-1.6.0</id>
>>>>>>>>           <url>
>>>>>>>>
>>>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>>>>>
>>>>>>>>           </url>
>>>>>>>>         </repository>
>>>>>>>>         <repository>
>>>>>>>>           <id>archetype</id>
>>>>>>>>           <url>
>>>>>>>>
>>>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>>>>>
>>>>>>>>           </url>
>>>>>>>>         </repository>
>>>>>>>>       </repositories>
>>>>>>>>     </profile>
>>>>>>>> </profiles>
>>>>>>>> </settings>
>>>>>>>>
>>>>>>>> And reference that in your maven commands via --settings
>>>>>>>> path/to/settings.xml. This is useful for creating a quickstart
>>>>>>>> based on
>>>>>> the
>>>>>>>> staged release and for building against the staged jars.
>>


Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Dawid Wysakowicz <dw...@apache.org>.
+1

- verified compilation, tests
- verified checksum and gpg files
- verified sbt templates (g8, quickstart) - run assemblies on local cluster

- I could not execute the nightly-tests.sh though. The tests that were
failing most often are:
    - test_streaming_file_sink.sh
    - test_streaming_elasticsearch.sh

Those are connector tests though, and it might only be test flakiness, so I
think it should not block the release.

On 08/08/18 16:36, Chesnay Schepler wrote:
> I did not use the tools/list_deps.py script as I wasn't aware that it
> existed.
>
> Even if I had been, I wouldn't have used it, and in fact would advocate for
> removing it.
> It manually parses and constructs dependency information, which is
> utterly unnecessary, as Maven already provides this functionality with
> the added bonus of also accounting for dependencyManagement and
> transitive dependencies, which we obviously have to take into account.
>
> I used this one-liner instead:
> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
>
> which I have documented here:
> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
>
> On 08.08.2018 15:06, Aljoscha Krettek wrote:
>> +1
>>
>> - verified checksum and gpg files
>> - verified LICENSE and NOTICE: NOTICE didn't change from 1.5, LICENSE
>> had one unnecessary part removed
>>
>> Side comment: I'm not sure whether the "Verify that the LICENSE and
>> NOTICE file is correct for the binary and source releases" part is
>> valid anymore because we only have one LICENSE and NOTICE file. Also
>> "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
>> to the binary distribution and mention all of Flink's Maven
>> dependencies as well" can be dropped because we don't have them anymore.
>>
>> I came to the same conclusion on dependencies. I used
>> tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
>> probably what Chesnay also did ... :-)
>>
>>> On 8. Aug 2018, at 14:43, Chesnay Schepler <ch...@apache.org> wrote:
>>>
>>> +1
>>>
>>> - verified source release contains no binaries
>>> - verified correct versions in source release
>>> - verified compilation, tests and E2E-tests pass (on travis)
>>> - verified checksum and gpg files
>>>
>>> New dependencies (excluding dependencies where we simply depend on a
>>> different version now):
>>>     Apache licensed:
>>>         io.confluent:common-utils:jar:3.3.1
>>>         io.confluent:kafka-schema-registry-client:jar:3.3.1
>>>         io.prometheus:simpleclient_pushgateway:jar:0.3.0
>>>         various Apache NiFi dependencies
>>>         various Apache Parquet dependencies
>>>         various Elasticsearch dependencies
>>>     CDDL:
>>>         javax.ws.rs:javax.ws.rs-api:jar:2.1
>>>     Bouncycastle (MIT-like):
>>>         org.bouncycastle:bcpkix-jdk15on:jar:1.59
>>>         org.bouncycastle:bcprov-jdk15on:jar:1.59
>>>     MIT:
>>>         org.projectlombok:lombok:jar:1.16.20
>>>
>>> On 08.08.2018 13:28, Till Rohrmann wrote:
>>>> Thanks for reporting these problems, Chesnay. The usage string in
>>>> `standalone-job.sh` is outdated and should be updated. The same
>>>> applies to
>>>> the typo.
>>>>
>>>> When calling `standalone-job.sh start --job-classname foobar.Job`
>>>> please
>>>> make sure that the user code jar is contained in the classpath (e.g.
>>>> putting the jar in the lib directory). Documenting this behaviour
>>>> is part
>>>> of the pending issue FLINK-10001.
>>>>
>>>> We should fix all of these issues. They are, however, not release
>>>> blockers.
>>>>
>>>> Cheers,
>>>> Till
>>>>
>>>> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler
>>>> <ch...@apache.org> wrote:
>>>>
>>>>> I found some issues with the standalone-job.sh script.
>>>>>
>>>>> I ran "./bin/standalone-job.sh start" as described by the usage
>>>>> string.
>>>>>
>>>>>      2018-08-08 09:22:34,385 ERROR
>>>>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>>>>      Could not parse command line arguments [--configDir,
>>>>>      /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
>>>>>      org.apache.flink.runtime.entrypoint.FlinkParseException:
>>>>> Failed to
>>>>>      parse the command line arguments.
>>>>>               at
>>>>>
>>>>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
>>>>>
>>>>>               at
>>>>>
>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
>>>>>
>>>>>      Caused by: org.apache.commons.cli.MissingOptionException:
>>>>> Missing
>>>>>      required option: j
>>>>>               at
>>>>>
>>>>> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
>>>>>
>>>>>               at
>>>>>     
>>>>> org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
>>>>>               at
>>>>>     
>>>>> org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
>>>>>               at
>>>>>
>>>>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
>>>>>
>>>>>              ... 1 more
>>>>>
>>>>> The script should fail earlier if no jar is provided, with a better
>>>>> error message.
>>>>> It is also undocumented, and the usage instructions don't appear
>>>>> correct.
>>>>>
>>>>> Passing a jar with the -j option leads to a ClassNotFoundException:
>>>>> "./bin/standalone-job.sh start -j examples/streaming/WordCount.jar"
>>>>>
>>>>>      2018-08-08 09:26:30,562 ERROR
>>>>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>>>>      Cluster initialization failed.
>>>>>      java.lang.reflect.UndeclaredThrowableException
>>>>>               at
>>>>>
>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
>>>>>
>>>>>               at
>>>>>
>>>>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>>>
>>>>>               at
>>>>>
>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
>>>>>
>>>>>               at
>>>>>
>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
>>>>>
>>>>>      Caused by: org.apache.flink.util.FlinkException: Could not
>>>>> load the
>>>>>      provied entrypoint class.
>>>>>               at
>>>>>
>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
>>>>>
>>>>>               at
>>>>>
>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
>>>>>
>>>>>               at
>>>>>
>>>>> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
>>>>>
>>>>>               at
>>>>>
>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
>>>>>
>>>>>               at
>>>>>
>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
>>>>>
>>>>>               at
>>>>>
>>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
>>>>>
>>>>>               at
>>>>> java.security.AccessController.doPrivileged(Native Method)
>>>>>               at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>>               at
>>>>>
>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>>>>
>>>>>               ... 3 more
>>>>>      Caused by: java.lang.ClassNotFoundException:
>>>>>      examples/streaming/WordCount.jar
>>>>>               at
>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>>>               at
>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>>               at
>>>>>      sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
>>>>>               at
>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>>               at
>>>>>
>>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
>>>>>
>>>>>               ... 11 more
>>>>>
>>>>> So this seems to not work at all, but maybe I'm using it wrong?
>>>>>
>>>>> (There's also a typo in "Could not load the provied entrypoint class")
>>>>>
>>>>> On 08.08.2018 10:33, Piotr Nowojski wrote:
>>>>>> +1 from my side
>>>>>>
>>>>>> I’ve spent some time playing around with various examples (batching,
>>>>> streaming and SQL) on a 6-node EMR cluster with YARN deployment, with
>>>>> different configuration options (number of task
>>>>> managers/memory/Flip6/credit-based flow control/metrics) and everything
>>>>> looks now fine (after fixing
>>>>> https://issues.apache.org/jira/browse/FLINK-9969 <
>>>>> https://issues.apache.org/jira/browse/FLINK-9969> ).
>>>>>> Piotrek
>>>>>>
>>>>>>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org>
>>>>>>> wrote:
>>>>>>>
>>>>>>> Hi everyone,
>>>>>>> Please review and vote on the release candidate #4 for the version
>>>>> 1.6.0,
>>>>>>> as follows:
>>>>>>> [ ] +1, Approve the release
>>>>>>> [ ] -1, Do not approve the release (please provide specific
>>>>>>> comments)
>>>>>>>
>>>>>>>
>>>>>>> The complete staging area is available for your review, which
>>>>>>> includes:
>>>>>>> * JIRA release notes [1],
>>>>>>> * the official Apache source release and binary convenience
>>>>>>> releases to
>>>>> be
>>>>>>> deployed to dist.apache.org [2], which are signed with the key with
>>>>>>> fingerprint 1F302569A96CFFD5 [3],
>>>>>>> * all artifacts to be deployed to the Maven Central Repository [4],
>>>>>>> * source code tag "release-1.6.0-rc4" [5],
>>>>>>> * website pull request listing the new release and adding
>>>>>>> announcement
>>>>> blog
>>>>>>> post [6].
>>>>>>>
>>>>>>> Please use this document for coordinating testing efforts: [7]
>>>>>>>
>>>>>>> The vote will be shortened since we only added a minor fix on top
>>>>>>> of the
>>>>> RC
>>>>>>> 3. It will close on Wednesday 6:30pm CET. It is adopted by majority
>>>>>>> approval, with at least 3 PMC affirmative votes.
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Your friendly Release Manager
>>>>>>>
>>>>>>> [1]
>>>>>>>
>>>>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
>>>>>
>>>>>>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
>>>>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
>>>>>>> [4]
>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178
>>>>>
>>>>>>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
>>>>>>> [6] https://github.com/apache/flink-web/pull/117
>>>>>>> [7]
>>>>>>>
>>>>> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
>>>>>
>>>>>>> Pro-tip: you can create a settings.xml file with these contents:
>>>>>>>
>>>>>>> <settings>
>>>>>>> <activeProfiles>
>>>>>>>    <activeProfile>flink-1.6.0</activeProfile>
>>>>>>> </activeProfiles>
>>>>>>> <profiles>
>>>>>>>    <profile>
>>>>>>>      <id>flink-1.6.0</id>
>>>>>>>      <repositories>
>>>>>>>        <repository>
>>>>>>>          <id>flink-1.6.0</id>
>>>>>>>          <url>
>>>>>>>
>>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>>>>
>>>>>>>          </url>
>>>>>>>        </repository>
>>>>>>>        <repository>
>>>>>>>          <id>archetype</id>
>>>>>>>          <url>
>>>>>>>
>>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>>>>
>>>>>>>          </url>
>>>>>>>        </repository>
>>>>>>>      </repositories>
>>>>>>>    </profile>
>>>>>>> </profiles>
>>>>>>> </settings>
>>>>>>>
>>>>>>> And reference that in your maven commands via --settings
>>>>>>> path/to/settings.xml. This is useful for creating a quickstart
>>>>>>> based on
>>>>> the
>>>>>>> staged release and for building against the staged jars.
>>
>
>


Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Chesnay Schepler <ch...@apache.org>.
I did not use the tools/list_deps.py script as I wasn't aware that it 
existed.

Even if I had been, I wouldn't have used it, and in fact would advocate for
removing it.
It manually parses and constructs dependency information, which is
utterly unnecessary, as Maven already provides this functionality with
the added bonus of also accounting for dependencyManagement and
transitive dependencies, which we obviously have to take into account.

I used this one-liner instead:
mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u

which I have documented here:
https://cwiki.apache.org/confluence/display/FLINK/Dependencies
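As a hedged illustration of what that one-liner does (the input line below is a made-up sample in the shape of `mvn dependency:list` output, not real build output), each stage keeps only coordinate-shaped lines, drops the log prefix, and strips the trailing scope:

```shell
# Illustrative sample line; real input would be piped from `mvn dependency:list`
line='[INFO]    io.confluent:common-utils:jar:3.3.1:compile'

# grep keeps lines with at least three colons (group:artifact:type:version:scope),
# cut -d] -f2- drops the "[INFO]" prefix, sed strips the ":compile"/":test" scope,
# sort -u deduplicates across modules
echo "$line" \
  | grep ":.*:.*:.*" \
  | cut -d] -f2- \
  | sed 's/:[a-z]*$//g' \
  | sort -u
```

Lines such as "[INFO] Finished at: ..." are filtered out by the `grep -v` stage in the full pipeline.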

On 08.08.2018 15:06, Aljoscha Krettek wrote:
> +1
>
> - verified checksum and gpg files
> - verified LICENSE and NOTICE: NOTICE didn't change from 1.5, LICENSE had one unnecessary part removed
>
> Side comment: I'm not sure whether the "Verify that the LICENSE and NOTICE file is correct for the binary and source releases" part is valid anymore because we only have one LICENSE and NOTICE file. Also "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer to the binary distribution and mention all of Flink's Maven dependencies as well" can be dropped because we don't have them anymore.
>
> I came to the same conclusion on dependencies. I used tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's probably what Chesnay also did ... :-)
>
>> On 8. Aug 2018, at 14:43, Chesnay Schepler <ch...@apache.org> wrote:
>>
>> +1
>>
>> - verified source release contains no binaries
>> - verified correct versions in source release
>> - verified compilation, tests and E2E-tests pass (on travis)
>> - verified checksum and gpg files
>>
>> New dependencies (excluding dependencies where we simply depend on a different version now):
>>     Apache licensed:
>>         io.confluent:common-utils:jar:3.3.1
>>         io.confluent:kafka-schema-registry-client:jar:3.3.1
>>         io.prometheus:simpleclient_pushgateway:jar:0.3.0
>>         various Apache NiFi dependencies
>>         various Apache Parquet dependencies
>>         various Elasticsearch dependencies
>>     CDDL:
>>         javax.ws.rs:javax.ws.rs-api:jar:2.1
>>     Bouncycastle (MIT-like):
>>         org.bouncycastle:bcpkix-jdk15on:jar:1.59
>>         org.bouncycastle:bcprov-jdk15on:jar:1.59
>>     MIT:
>>         org.projectlombok:lombok:jar:1.16.20
>>
>> On 08.08.2018 13:28, Till Rohrmann wrote:
>>> Thanks for reporting these problems, Chesnay. The usage string in
>>> `standalone-job.sh` is outdated and should be updated. The same applies to
>>> the typo.
>>>
>>> When calling `standalone-job.sh start --job-classname foobar.Job` please
>>> make sure that the user code jar is contained in the classpath (e.g.
>>> putting the jar in the lib directory). Documenting this behaviour is part
>>> of the pending issue FLINK-10001.
>>>
>>> We should fix all of these issues. They are, however, not release blockers.
>>>
>>> Cheers,
>>> Till
>>>
>>> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler <ch...@apache.org> wrote:
>>>
>>>> I found some issues with the standalone-job.sh script.
>>>>
>>>> I ran "./bin/standalone-job.sh start" as described by the usage string.
>>>>
>>>>      2018-08-08 09:22:34,385 ERROR
>>>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>>>      Could not parse command line arguments [--configDir,
>>>>      /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
>>>>      org.apache.flink.runtime.entrypoint.FlinkParseException: Failed to
>>>>      parse the command line arguments.
>>>>               at
>>>>
>>>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
>>>>               at
>>>>
>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
>>>>      Caused by: org.apache.commons.cli.MissingOptionException: Missing
>>>>      required option: j
>>>>               at
>>>>
>>>> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
>>>>               at
>>>>      org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
>>>>               at
>>>>      org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
>>>>               at
>>>>
>>>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
>>>>              ... 1 more
>>>>
>>>> The script should fail earlier if no jar is provided, with a better
>>>> error message.
>>>> It is also undocumented, and the usage instructions don't appear correct.
>>>>
>>>> Passing a jar with the -j option leads to a ClassNotFoundException:
>>>> "./bin/standalone-job.sh start -j examples/streaming/WordCount.jar"
>>>>
>>>>      2018-08-08 09:26:30,562 ERROR
>>>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>>>      Cluster initialization failed.
>>>>      java.lang.reflect.UndeclaredThrowableException
>>>>               at
>>>>
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
>>>>               at
>>>>
>>>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>>               at
>>>>
>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
>>>>               at
>>>>
>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
>>>>      Caused by: org.apache.flink.util.FlinkException: Could not load the
>>>>      provied entrypoint class.
>>>>               at
>>>>
>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
>>>>               at
>>>>
>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
>>>>               at
>>>>
>>>> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
>>>>               at
>>>>
>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
>>>>               at
>>>>
>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
>>>>               at
>>>>
>>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
>>>>               at java.security.AccessController.doPrivileged(Native Method)
>>>>               at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>               at
>>>>
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>>>               ... 3 more
>>>>      Caused by: java.lang.ClassNotFoundException:
>>>>      examples/streaming/WordCount.jar
>>>>               at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>>               at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>               at
>>>>      sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
>>>>               at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>               at
>>>>
>>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
>>>>               ... 11 more
>>>>
>>>> So this seems to not work at all, but maybe I'm using it wrong?
>>>>
>>>> (There's also a typo in "Could not load the provied entrypoint class")
>>>>
>>>> On 08.08.2018 10:33, Piotr Nowojski wrote:
>>>>> +1 from my side
>>>>>
>>>>> I’ve spent some time playing around with various examples (batching,
>>>> streaming and SQL) on a 6-node EMR cluster with YARN deployment, with
>>>> different configuration options (number of task
>>>> managers/memory/Flip6/credit-based flow control/metrics) and everything
>>>> looks now fine (after fixing
>>>> https://issues.apache.org/jira/browse/FLINK-9969 <
>>>> https://issues.apache.org/jira/browse/FLINK-9969> ).
>>>>> Piotrek
>>>>>
>>>>>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org> wrote:
>>>>>>
>>>>>> Hi everyone,
>>>>>> Please review and vote on the release candidate #4 for the version
>>>> 1.6.0,
>>>>>> as follows:
>>>>>> [ ] +1, Approve the release
>>>>>> [ ] -1, Do not approve the release (please provide specific comments)
>>>>>>
>>>>>>
>>>>>> The complete staging area is available for your review, which includes:
>>>>>> * JIRA release notes [1],
>>>>>> * the official Apache source release and binary convenience releases to
>>>> be
>>>>>> deployed to dist.apache.org [2], which are signed with the key with
>>>>>> fingerprint 1F302569A96CFFD5 [3],
>>>>>> * all artifacts to be deployed to the Maven Central Repository [4],
>>>>>> * source code tag "release-1.6.0-rc4" [5],
>>>>>> * website pull request listing the new release and adding announcement
>>>> blog
>>>>>> post [6].
>>>>>>
>>>>>> Please use this document for coordinating testing efforts: [7]
>>>>>>
>>>>>> The vote will be shortened since we only added a minor fix on top of the
>>>> RC
>>>>>> 3. It will close on Wednesday 6:30pm CET. It is adopted by majority
>>>>>> approval, with at least 3 PMC affirmative votes.
>>>>>>
>>>>>> Thanks,
>>>>>> Your friendly Release Manager
>>>>>>
>>>>>> [1]
>>>>>>
>>>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
>>>>>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
>>>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
>>>>>> [4]
>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178
>>>>>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
>>>>>> [6] https://github.com/apache/flink-web/pull/117
>>>>>> [7]
>>>>>>
>>>> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
>>>>>> Pro-tip: you can create a settings.xml file with these contents:
>>>>>>
>>>>>> <settings>
>>>>>> <activeProfiles>
>>>>>>    <activeProfile>flink-1.6.0</activeProfile>
>>>>>> </activeProfiles>
>>>>>> <profiles>
>>>>>>    <profile>
>>>>>>      <id>flink-1.6.0</id>
>>>>>>      <repositories>
>>>>>>        <repository>
>>>>>>          <id>flink-1.6.0</id>
>>>>>>          <url>
>>>>>>
>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>>>          </url>
>>>>>>        </repository>
>>>>>>        <repository>
>>>>>>          <id>archetype</id>
>>>>>>          <url>
>>>>>>
>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>>>          </url>
>>>>>>        </repository>
>>>>>>      </repositories>
>>>>>>    </profile>
>>>>>> </profiles>
>>>>>> </settings>
>>>>>>
>>>>>> And reference that in your maven commands via --settings
>>>>>> path/to/settings.xml. This is useful for creating a quickstart based on
>>>> the
>>>>>> staged release and for building against the staged jars.
>


Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Aljoscha Krettek <al...@apache.org>.
+1

- verified checksum and gpg files
- verified LICENSE and NOTICE: NOTICE didn't change from 1.5, LICENSE had one unnecessary part removed

Side comment: I'm not sure whether the "Verify that the LICENSE and NOTICE file is correct for the binary and source releases" part is still valid, because we only have one LICENSE and NOTICE file. Also, "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer to the binary distribution and mention all of Flink's Maven dependencies as well" can be dropped because we don't have them anymore.

I came to the same conclusion on dependencies. I used tools/list_deps.py and diff'ed the output for 1.5 and 1.6; that's probably what Chesnay also did ... :-)
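For reference, the comparison step described above can be sketched as follows. The exact invocation and output format of tools/list_deps.py are assumptions; tiny printf stand-ins take its place here to show only the sort-and-compare step:

```shell
# Generate one sorted dependency list per release. In practice the output
# of tools/list_deps.py for each source tree would replace these stand-ins.
printf 'io.prometheus:simpleclient_pushgateway:jar:0.3.0\norg.projectlombok:lombok:jar:1.16.20\n' | sort > deps-1.6.txt
printf 'io.prometheus:simpleclient_pushgateway:jar:0.3.0\n' | sort > deps-1.5.txt

# comm -13 suppresses lines unique to the first file and lines common to
# both, printing only lines unique to the second file, i.e. new in 1.6.
comm -13 deps-1.5.txt deps-1.6.txt
```

A plain `diff deps-1.5.txt deps-1.6.txt` works just as well if you also want to see removed dependencies.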

> On 8. Aug 2018, at 14:43, Chesnay Schepler <ch...@apache.org> wrote:
> 
> +1
> 
> - verified source release contains no binaries
> - verified correct versions in source release
> - verified compilation, tests and E2E-tests pass (on travis)
> - verified checksum and gpg files
> 
> New dependencies (excluding dependencies where we simply depend on a different version now):
>    Apache licensed:
>        io.confluent:common-utils:jar:3.3.1
>        io.confluent:kafka-schema-registry-client:jar:3.3.1
>        io.prometheus:simpleclient_pushgateway:jar:0.3.0
>        various Apache Nifi dependencies
>        various Apache Parquet dependencies
>        various ElasticSearch dependencies
>    CDDL:
>        javax.ws.rs:javax.ws.rs-api:jar:2.1
>    Bouncycastle (MIT-like):
>        org.bouncycastle:bcpkix-jdk15on:jar:1.59
>        org.bouncycastle:bcprov-jdk15on:jar:1.59
>    MIT:
>        org.projectlombok:lombok:jar:1.16.20
> 
> On 08.08.2018 13:28, Till Rohrmann wrote:
>> Thanks for reporting these problems Chesnay. The usage string in
>> `standalone-job.sh` is outdated and should be updated. The same applies to
>> the typo.
>> 
>> When calling `standalone-job.sh start --job-classname foobar.Job` please
>> make sure that the user code jar is contained in the classpath (e.g.
>> putting the jar in the lib directory). Documenting this behaviour is part
>> of the pending issue FLINK-10001.
>> 
>> We should fix all of these issues. They are, however, no release blockers.
>> 
>> Cheers,
>> Till
>> 
>> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler <ch...@apache.org> wrote:
>> 
>>> I found some issues with the standalone-job.sh script.
>>> 
>>> I ran "./bin/standalone-job.sh start" as described by the usage string.
>>> 
>>>     2018-08-08 09:22:34,385 ERROR
>>>     org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>>     Could not parse command line arguments [--configDir,
>>>     /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
>>>     org.apache.flink.runtime.entrypoint.FlinkParseException: Failed to
>>>     parse the command line arguments.
>>>              at
>>> 
>>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
>>>              at
>>> 
>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
>>>     Caused by: org.apache.commons.cli.MissingOptionException: Missing
>>>     required option: j
>>>              at
>>> 
>>> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
>>>              at
>>>     org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
>>>              at
>>>     org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
>>>              at
>>> 
>>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
>>>             ... 1 more
>>> 
>>> The script should fail earlier if no jar is provided, with a better
>>> error message.
>>> It is also undocumented, and the usage instructions don't appear correct.
>>> 
>>> Passing a jar with the -j option leads to a ClassNotFoundException:
>>> "./bin/standalone-job.sh start -j examples/streaming/WordCount.jar"
>>> 
>>>     2018-08-08 09:26:30,562 ERROR
>>>     org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>>     Cluster initialization failed.
>>>     java.lang.reflect.UndeclaredThrowableException
>>>              at
>>> 
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
>>>              at
>>> 
>>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>              at
>>> 
>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
>>>              at
>>> 
>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
>>>     Caused by: org.apache.flink.util.FlinkException: Could not load the
>>>     provied entrypoint class.
>>>              at
>>> 
>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
>>>              at
>>> 
>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
>>>              at
>>> 
>>> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
>>>              at
>>> 
>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
>>>              at
>>> 
>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
>>>              at
>>> 
>>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
>>>              at java.security.AccessController.doPrivileged(Native Method)
>>>              at javax.security.auth.Subject.doAs(Subject.java:422)
>>>              at
>>> 
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>>              ... 3 more
>>>     Caused by: java.lang.ClassNotFoundException:
>>>     examples/streaming/WordCount.jar
>>>              at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>              at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>              at
>>>     sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
>>>              at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>              at
>>> 
>>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
>>>              ... 11 more
>>> 
>>> So this seems to not work at all, but maybe I'm using it wrong?
>>> 
>>> (There's also a typo in "Could not load the provied entrypoint class")
>>> 
>>> On 08.08.2018 10:33, Piotr Nowojski wrote:
>>>> +1 from my side
>>>> 
>>>> I’ve spent some time playing around with various examples (batching,
>>> streaming and SQL) on a 6-node EMR cluster with YARN deployment, with
>>> different configuration options (number of task
>>> managers/memory/Flip6/credit-based flow control/metrics) and everything
>>> looks fine now (after fixing
>>> https://issues.apache.org/jira/browse/FLINK-9969 <
>>> https://issues.apache.org/jira/browse/FLINK-9969> ).
>>>> Piotrek
>>>> 
>>>>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org> wrote:
>>>>> 
>>>>> Hi everyone,
>>>>> Please review and vote on the release candidate #4 for the version
>>> 1.6.0,
>>>>> as follows:
>>>>> [ ] +1, Approve the release
>>>>> [ ] -1, Do not approve the release (please provide specific comments)
>>>>> 
>>>>> 
>>>>> The complete staging area is available for your review, which includes:
>>>>> * JIRA release notes [1],
>>>>> * the official Apache source release and binary convenience releases to
>>> be
>>>>> deployed to dist.apache.org [2], which are signed with the key with
>>>>> fingerprint 1F302569A96CFFD5 [3],
>>>>> * all artifacts to be deployed to the Maven Central Repository [4],
>>>>> * source code tag "release-1.6.0-rc4" [5],
>>>>> * website pull request listing the new release and adding announcement
>>> blog
>>>>> post [6].
>>>>> 
>>>>> Please use this document for coordinating testing efforts: [7]
>>>>> 
>>>>> The vote will be shortened since we only added a minor fix on top of the
>>> RC
>>>>> 3. It will close on Wednesday 6:30pm CET. It is adopted by majority
>>>>> approval, with at least 3 PMC affirmative votes.
>>>>> 
>>>>> Thanks,
>>>>> Your friendly Release Manager
>>>>> 
>>>>> [1]
>>>>> 
>>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
>>>>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
>>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
>>>>> [4]
>>> https://repository.apache.org/content/repositories/orgapacheflink-1178
>>>>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
>>>>> [6] https://github.com/apache/flink-web/pull/117
>>>>> [7]
>>>>> 
>>> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
>>>>> Pro-tip: you can create a settings.xml file with these contents:
>>>>> 
>>>>> <settings>
>>>>> <activeProfiles>
>>>>>   <activeProfile>flink-1.6.0</activeProfile>
>>>>> </activeProfiles>
>>>>> <profiles>
>>>>>   <profile>
>>>>>     <id>flink-1.6.0</id>
>>>>>     <repositories>
>>>>>       <repository>
>>>>>         <id>flink-1.6.0</id>
>>>>>         <url>
>>>>> 
>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>>         </url>
>>>>>       </repository>
>>>>>       <repository>
>>>>>         <id>archetype</id>
>>>>>         <url>
>>>>> 
>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>>         </url>
>>>>>       </repository>
>>>>>     </repositories>
>>>>>   </profile>
>>>>> </profiles>
>>>>> </settings>
>>>>> 
>>>>> And reference that in your maven commands via --settings
>>>>> path/to/settings.xml. This is useful for creating a quickstart based on
>>> the
>>>>> staged release and for building against the staged jars.
>>> 
> 


Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Chesnay Schepler <ch...@apache.org>.
+1

- verified source release contains no binaries
- verified correct versions in source release
- verified compilation, tests and E2E-tests pass (on travis)
- verified checksum and gpg files

New dependencies (excluding dependencies where we simply depend on a 
different version now):
     Apache licensed:
         io.confluent:common-utils:jar:3.3.1
         io.confluent:kafka-schema-registry-client:jar:3.3.1
         io.prometheus:simpleclient_pushgateway:jar:0.3.0
         various Apache Nifi dependencies
         various Apache Parquet dependencies
         various ElasticSearch dependencies
     CDDL:
         javax.ws.rs:javax.ws.rs-api:jar:2.1
     Bouncycastle (MIT-like):
         org.bouncycastle:bcpkix-jdk15on:jar:1.59
         org.bouncycastle:bcprov-jdk15on:jar:1.59
     MIT:
         org.projectlombok:lombok:jar:1.16.20

On 08.08.2018 13:28, Till Rohrmann wrote:
> Thanks for reporting these problems Chesnay. The usage string in
> `standalone-job.sh` is outdated and should be updated. The same applies to
> the typo.
>
> When calling `standalone-job.sh start --job-classname foobar.Job` please
> make sure that the user code jar is contained in the classpath (e.g.
> putting the jar in the lib directory). Documenting this behaviour is part
> of the pending issue FLINK-10001.
>
> We should fix all of these issues. They are, however, no release blockers.
>
> Cheers,
> Till
>
> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler <ch...@apache.org> wrote:
>
>> I found some issues with the standalone-job.sh script.
>>
>> I ran "./bin/standalone-job.sh start" as described by the usage string.
>>
>>      2018-08-08 09:22:34,385 ERROR
>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>      Could not parse command line arguments [--configDir,
>>      /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
>>      org.apache.flink.runtime.entrypoint.FlinkParseException: Failed to
>>      parse the command line arguments.
>>               at
>>
>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
>>               at
>>
>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
>>      Caused by: org.apache.commons.cli.MissingOptionException: Missing
>>      required option: j
>>               at
>>
>> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
>>               at
>>      org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
>>               at
>>      org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
>>               at
>>
>> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
>>              ... 1 more
>>
>> The script should fail earlier if no jar is provided, with a better
>> error message.
>> It is also undocumented, and the usage instructions don't appear correct.
>>
>> Passing a jar with the -j option leads to a ClassNotFoundException:
>> "./bin/standalone-job.sh start -j examples/streaming/WordCount.jar"
>>
>>      2018-08-08 09:26:30,562 ERROR
>>      org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>>      Cluster initialization failed.
>>      java.lang.reflect.UndeclaredThrowableException
>>               at
>>
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
>>               at
>>
>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>               at
>>
>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
>>               at
>>
>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
>>      Caused by: org.apache.flink.util.FlinkException: Could not load the
>>      provied entrypoint class.
>>               at
>>
>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
>>               at
>>
>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
>>               at
>>
>> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
>>               at
>>
>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
>>               at
>>
>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
>>               at
>>
>> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
>>               at java.security.AccessController.doPrivileged(Native Method)
>>               at javax.security.auth.Subject.doAs(Subject.java:422)
>>               at
>>
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>               ... 3 more
>>      Caused by: java.lang.ClassNotFoundException:
>>      examples/streaming/WordCount.jar
>>               at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>               at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>               at
>>      sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
>>               at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>               at
>>
>> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
>>               ... 11 more
>>
>> So this seems to not work at all, but maybe I'm using it wrong?
>>
>> (There's also a typo in "Could not load the provied entrypoint class")
>>
>> On 08.08.2018 10:33, Piotr Nowojski wrote:
>>> +1 from my side
>>>
>>> I’ve spent some time playing around with various examples (batching,
>> streaming and SQL) on a 6-node EMR cluster with YARN deployment, with
>> different configuration options (number of task
>> managers/memory/Flip6/credit-based flow control/metrics) and everything
>> looks fine now (after fixing
>> https://issues.apache.org/jira/browse/FLINK-9969 <
>> https://issues.apache.org/jira/browse/FLINK-9969> ).
>>> Piotrek
>>>
>>>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org> wrote:
>>>>
>>>> Hi everyone,
>>>> Please review and vote on the release candidate #4 for the version
>> 1.6.0,
>>>> as follows:
>>>> [ ] +1, Approve the release
>>>> [ ] -1, Do not approve the release (please provide specific comments)
>>>>
>>>>
>>>> The complete staging area is available for your review, which includes:
>>>> * JIRA release notes [1],
>>>> * the official Apache source release and binary convenience releases to
>> be
>>>> deployed to dist.apache.org [2], which are signed with the key with
>>>> fingerprint 1F302569A96CFFD5 [3],
>>>> * all artifacts to be deployed to the Maven Central Repository [4],
>>>> * source code tag "release-1.6.0-rc4" [5],
>>>> * website pull request listing the new release and adding announcement
>> blog
>>>> post [6].
>>>>
>>>> Please use this document for coordinating testing efforts: [7]
>>>>
>>>> The vote will be shortened since we only added a minor fix on top of the
>> RC
>>>> 3. It will close on Wednesday 6:30pm CET. It is adopted by majority
>>>> approval, with at least 3 PMC affirmative votes.
>>>>
>>>> Thanks,
>>>> Your friendly Release Manager
>>>>
>>>> [1]
>>>>
>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
>>>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
>>>> [4]
>> https://repository.apache.org/content/repositories/orgapacheflink-1178
>>>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
>>>> [6] https://github.com/apache/flink-web/pull/117
>>>> [7]
>>>>
>> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
>>>> Pro-tip: you can create a settings.xml file with these contents:
>>>>
>>>> <settings>
>>>> <activeProfiles>
>>>>    <activeProfile>flink-1.6.0</activeProfile>
>>>> </activeProfiles>
>>>> <profiles>
>>>>    <profile>
>>>>      <id>flink-1.6.0</id>
>>>>      <repositories>
>>>>        <repository>
>>>>          <id>flink-1.6.0</id>
>>>>          <url>
>>>>
>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>          </url>
>>>>        </repository>
>>>>        <repository>
>>>>          <id>archetype</id>
>>>>          <url>
>>>>
>>>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>>>          </url>
>>>>        </repository>
>>>>      </repositories>
>>>>    </profile>
>>>> </profiles>
>>>> </settings>
>>>>
>>>> And reference that in your maven commands via --settings
>>>> path/to/settings.xml. This is useful for creating a quickstart based on
>> the
>>>> staged release and for building against the staged jars.
>>


Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Till Rohrmann <tr...@apache.org>.
Thanks for reporting these problems, Chesnay. The usage string in
`standalone-job.sh` is outdated and should be updated. The same applies to
the typo.

When calling `standalone-job.sh start --job-classname foobar.Job` please
make sure that the user code jar is contained in the classpath (e.g.
putting the jar in the lib directory). Documenting this behaviour is part
of the pending issue FLINK-10001.
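For reference, the workaround described above looks roughly like this. The paths and the entry class are illustrative assumptions, not tested commands; only the option name --job-classname comes from the thread itself:

```shell
# Illustrative sketch only: make the user-code jar part of the classpath
# by copying it into lib/, then start the standalone job by class name
# rather than by jar path. The class name below is an assumed example.
cp examples/streaming/WordCount.jar lib/
./bin/standalone-job.sh start --job-classname org.apache.flink.streaming.examples.wordcount.WordCount
```

This matches the behaviour Till describes: the entry point loads the class from the classpath, so passing a jar path to -j cannot work on its own.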

We should fix all of these issues. They are, however, no release blockers.

Cheers,
Till

On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler <ch...@apache.org> wrote:

> I found some issues with the standalone-job.sh script.
>
> I ran "./bin/standalone-job.sh start" as described by the usage string.
>
>     2018-08-08 09:22:34,385 ERROR
>     org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>     Could not parse command line arguments [--configDir,
>     /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
>     org.apache.flink.runtime.entrypoint.FlinkParseException: Failed to
>     parse the command line arguments.
>              at
>
> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
>              at
>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
>     Caused by: org.apache.commons.cli.MissingOptionException: Missing
>     required option: j
>              at
>
> org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
>              at
>     org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
>              at
>     org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
>              at
>
> org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
>             ... 1 more
>
> The script should fail earlier if no jar is provided, with a better
> error message.
> It is also undocumented, and the usage instructions don't appear correct.
>
> Passing a jar with the -j option leads to a ClassNotFoundException:
> "./bin/standalone-job.sh start -j examples/streaming/WordCount.jar"
>
>     2018-08-08 09:26:30,562 ERROR
>     org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
>     Cluster initialization failed.
>     java.lang.reflect.UndeclaredThrowableException
>              at
>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
>              at
>
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>              at
>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
>              at
>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
>     Caused by: org.apache.flink.util.FlinkException: Could not load the
>     provied entrypoint class.
>              at
>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
>              at
>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
>              at
>
> org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
>              at
>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
>              at
>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
>              at
>
> org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
>              at java.security.AccessController.doPrivileged(Native Method)
>              at javax.security.auth.Subject.doAs(Subject.java:422)
>              at
>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>              ... 3 more
>     Caused by: java.lang.ClassNotFoundException:
>     examples/streaming/WordCount.jar
>              at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>              at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>              at
>     sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
>              at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>              at
>
> org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
>              ... 11 more
>
> So this seems to not work at all, but maybe I'm using it wrong?
>
> (There's also a typo in "Could not load the provied entrypoint class")
>
> On 08.08.2018 10:33, Piotr Nowojski wrote:
> > +1 from my side
> >
> > I’ve spent some time playing around with various examples (batching,
> streaming and SQL) on a 6-node EMR cluster with YARN deployment, with
> different configuration options (number of task
> managers/memory/Flip6/credit-based flow control/metrics) and everything
> looks fine now (after fixing
> https://issues.apache.org/jira/browse/FLINK-9969 <
> https://issues.apache.org/jira/browse/FLINK-9969> ).
> >
> > Piotrek
> >
> >> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org> wrote:
> >>
> >> Hi everyone,
> >> Please review and vote on the release candidate #4 for the version
> 1.6.0,
> >> as follows:
> >> [ ] +1, Approve the release
> >> [ ] -1, Do not approve the release (please provide specific comments)
> >>
> >>
> >> The complete staging area is available for your review, which includes:
> >> * JIRA release notes [1],
> >> * the official Apache source release and binary convenience releases to
> be
> >> deployed to dist.apache.org [2], which are signed with the key with
> >> fingerprint 1F302569A96CFFD5 [3],
> >> * all artifacts to be deployed to the Maven Central Repository [4],
> >> * source code tag "release-1.6.0-rc4" [5],
> >> * website pull request listing the new release and adding announcement
> blog
> >> post [6].
> >>
> >> Please use this document for coordinating testing efforts: [7]
> >>
> >> The vote will be shortened since we only added a minor fix on top of the
> RC
> >> 3. It will close on Wednesday 6:30pm CET. It is adopted by majority
> >> approval, with at least 3 PMC affirmative votes.
> >>
> >> Thanks,
> >> Your friendly Release Manager
> >>
> >> [1]
> >>
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
> >> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
> >> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
> >> [4]
> https://repository.apache.org/content/repositories/orgapacheflink-1178
> >> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
> >> [6] https://github.com/apache/flink-web/pull/117
> >> [7]
> >>
> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
> >>
> >> Pro-tip: you can create a settings.xml file with these contents:
> >>
> >> <settings>
> >> <activeProfiles>
> >>   <activeProfile>flink-1.6.0</activeProfile>
> >> </activeProfiles>
> >> <profiles>
> >>   <profile>
> >>     <id>flink-1.6.0</id>
> >>     <repositories>
> >>       <repository>
> >>         <id>flink-1.6.0</id>
> >>         <url>
> >>
> >> https://repository.apache.org/content/repositories/orgapacheflink-1178/
> >>         </url>
> >>       </repository>
> >>       <repository>
> >>         <id>archetype</id>
> >>         <url>
> >>
> >> https://repository.apache.org/content/repositories/orgapacheflink-1178/
> >>         </url>
> >>       </repository>
> >>     </repositories>
> >>   </profile>
> >> </profiles>
> >> </settings>
> >>
> >> And reference that in your maven commands via --settings
> >> path/to/settings.xml. This is useful for creating a quickstart based on
> the
> >> staged release and for building against the staged jars.
> >
>
>

Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Chesnay Schepler <ch...@apache.org>.
I found some issues with the standalone-job.sh script.

I ran "./bin/standalone-job.sh start" as described by the usage string.

    2018-08-08 09:22:34,385 ERROR
    org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
    Could not parse command line arguments [--configDir,
    /home/zento/svn/flink-1.6.0/flink-1.6.0/conf].
    org.apache.flink.runtime.entrypoint.FlinkParseException: Failed to
    parse the command line arguments.
             at
    org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
             at
    org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:143)
    Caused by: org.apache.commons.cli.MissingOptionException: Missing
    required option: j
             at
    org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
             at
    org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
             at
    org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
             at
    org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
            ... 1 more

The script should fail earlier if no jar is provided, with a better 
error message.
It is also undocumented, and the usage instructions don't appear correct.

Passing a jar with the -j option leads to a ClassNotFoundException:
"./bin/standalone-job.sh start -j examples/streaming/WordCount.jar"

    2018-08-08 09:26:30,562 ERROR
    org.apache.flink.runtime.entrypoint.ClusterEntrypoint         -
    Cluster initialization failed.
    java.lang.reflect.UndeclaredThrowableException
             at
    org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)
             at
    org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
             at
    org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:189)
             at
    org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.main(StandaloneJobClusterEntryPoint.java:158)
    Caused by: org.apache.flink.util.FlinkException: Could not load the
    provied entrypoint class.
             at
    org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:92)
             at
    org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.retrieveJobGraph(StandaloneJobClusterEntryPoint.java:75)
             at
    org.apache.flink.runtime.entrypoint.JobClusterEntrypoint.createDispatcher(JobClusterEntrypoint.java:107)
             at
    org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startClusterComponents(ClusterEntrypoint.java:353)
             at
    org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:232)
             at
    org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$0(ClusterEntrypoint.java:190)
             at java.security.AccessController.doPrivileged(Native Method)
             at javax.security.auth.Subject.doAs(Subject.java:422)
             at
    org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
             ... 3 more
    Caused by: java.lang.ClassNotFoundException:
    examples/streaming/WordCount.jar
             at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
             at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
             at
    sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
             at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
             at
    org.apache.flink.container.entrypoint.StandaloneJobClusterEntryPoint.createPackagedProgram(StandaloneJobClusterEntryPoint.java:89)
             ... 11 more

So this seems to not work at all, but maybe I'm using it wrong?

(There's also a typo in "Could not load the provied entrypoint class")

On 08.08.2018 10:33, Piotr Nowojski wrote:
> +1 from my side
>
> I’ve spent some time playing around with various examples (batching, streaming and SQL) on a 6-node EMR cluster with YARN deployment, with different configuration options (number of task managers/memory/Flip6/credit-based flow control/metrics) and everything looks fine now (after fixing https://issues.apache.org/jira/browse/FLINK-9969 <https://issues.apache.org/jira/browse/FLINK-9969> ).
>
> Piotrek
>
>> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org> wrote:
>>
>> Hi everyone,
>> Please review and vote on the release candidate #4 for the version 1.6.0,
>> as follows:
>> [ ] +1, Approve the release
>> [ ] -1, Do not approve the release (please provide specific comments)
>>
>>
>> The complete staging area is available for your review, which includes:
>> * JIRA release notes [1],
>> * the official Apache source release and binary convenience releases to be
>> deployed to dist.apache.org [2], which are signed with the key with
>> fingerprint 1F302569A96CFFD5 [3],
>> * all artifacts to be deployed to the Maven Central Repository [4],
>> * source code tag "release-1.6.0-rc4" [5],
>> * website pull request listing the new release and adding announcement blog
>> post [6].
>>
>> Please use this document for coordinating testing efforts: [7]
>>
>> The vote will be shortened since we only added a minor fix on top of
>> RC 3. It will close on Wednesday 6:30pm CET. It is adopted by majority
>> approval, with at least 3 PMC affirmative votes.
>>
>> Thanks,
>> Your friendly Release Manager
>>
>> [1]
>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
>> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
>> [4] https://repository.apache.org/content/repositories/orgapacheflink-1178
>> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
>> [6] https://github.com/apache/flink-web/pull/117
>> [7]
>> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
>>
>> Pro-tip: you can create a settings.xml file with these contents:
>>
>> <settings>
>> <activeProfiles>
>>   <activeProfile>flink-1.6.0</activeProfile>
>> </activeProfiles>
>> <profiles>
>>   <profile>
>>     <id>flink-1.6.0</id>
>>     <repositories>
>>       <repository>
>>         <id>flink-1.6.0</id>
>>         <url>
>>
>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>         </url>
>>       </repository>
>>       <repository>
>>         <id>archetype</id>
>>         <url>
>>
>> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>>         </url>
>>       </repository>
>>     </repositories>
>>   </profile>
>> </profiles>
>> </settings>
>>
>> And reference that in your Maven commands via --settings
>> path/to/settings.xml. This is useful for creating a quickstart based on the
>> staged release and for building against the staged jars.
>
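For reference, with the quoted settings.xml saved locally, building a quickstart against the staged artifacts might look like the sketch below (the archetype coordinates are assumed from the standard Flink quickstart and are not confirmed by this thread):

```shell
# Sketch: generate a quickstart project from the staged 1.6.0 artifacts.
# Assumes the settings.xml from the quoted message is at path/to/settings.xml.
mvn archetype:generate \
  -DarchetypeGroupId=org.apache.flink \
  -DarchetypeArtifactId=flink-quickstart-java \
  -DarchetypeVersion=1.6.0 \
  --settings path/to/settings.xml
```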


Re: [VOTE] Release 1.6.0, release candidate #4

Posted by Piotr Nowojski <pi...@data-artisans.com>.
+1 from my side

I’ve spent some time playing around with various examples (batch, streaming and SQL) on a 6-node EMR cluster with YARN deployment, with different configuration options (number of task managers/memory/FLIP-6/credit-based flow control/metrics), and everything now looks fine (after fixing https://issues.apache.org/jira/browse/FLINK-9969 <https://issues.apache.org/jira/browse/FLINK-9969> ).

Piotrek 

> On 7 Aug 2018, at 17:17, Till Rohrmann <tr...@apache.org> wrote:
> 
> Hi everyone,
> Please review and vote on the release candidate #4 for the version 1.6.0,
> as follows:
> [ ] +1, Approve the release
> [ ] -1, Do not approve the release (please provide specific comments)
> 
> 
> The complete staging area is available for your review, which includes:
> * JIRA release notes [1],
> * the official Apache source release and binary convenience releases to be
> deployed to dist.apache.org [2], which are signed with the key with
> fingerprint 1F302569A96CFFD5 [3],
> * all artifacts to be deployed to the Maven Central Repository [4],
> * source code tag "release-1.6.0-rc4" [5],
> * website pull request listing the new release and adding announcement blog
> post [6].
> 
> Please use this document for coordinating testing efforts: [7]
> 
> The vote will be shortened since we only added a minor fix on top of
> RC 3. It will close on Wednesday 6:30pm CET. It is adopted by majority
> approval, with at least 3 PMC affirmative votes.
> 
> Thanks,
> Your friendly Release Manager
> 
> [1]
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
> [2] https://dist.apache.org/repos/dist/dev/flink/flink-1.6.0/
> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
> [4] https://repository.apache.org/content/repositories/orgapacheflink-1178
> [5] https://github.com/apache/flink/tree/release-1.6.0-rc4
> [6] https://github.com/apache/flink-web/pull/117
> [7]
> https://docs.google.com/document/d/1upBFZQ7tbaSkYvDiLqfUFXKg8Xxs-lVheEfb66e4jpo/edit?usp=sharing
> 
> Pro-tip: you can create a settings.xml file with these contents:
> 
> <settings>
> <activeProfiles>
>  <activeProfile>flink-1.6.0</activeProfile>
> </activeProfiles>
> <profiles>
>  <profile>
>    <id>flink-1.6.0</id>
>    <repositories>
>      <repository>
>        <id>flink-1.6.0</id>
>        <url>
> 
> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>        </url>
>      </repository>
>      <repository>
>        <id>archetype</id>
>        <url>
> 
> https://repository.apache.org/content/repositories/orgapacheflink-1178/
>        </url>
>      </repository>
>    </repositories>
>  </profile>
> </profiles>
> </settings>
> 
> And reference that in your Maven commands via --settings
> path/to/settings.xml. This is useful for creating a quickstart based on the
> staged release and for building against the staged jars.