Posted to dev@flink.apache.org by Lokesh Rajaram <ra...@gmail.com> on 2015/05/10 04:49:15 UTC

Hello Everyone

Hello All,

I am new to the Flink community and am very excited about the project and the
work you all have been doing. Kudos!!

I was looking to pick up a starter task, and Robert recommended
https://issues.apache.org/jira/browse/FLINK-1711. Thanks, Robert, for your
guidance.

Sorry for a dumb question: I am done with the code changes, but "mvn verify"
is failing only for the Scala module, as follows:

flink/flink-scala/src/main/scala/org/apache/flink/api/scala/joinDataSet.scala:77:
error: ambiguous reference to overloaded definition,
[ERROR] both method checkNotNull in object Preconditions of type [T](x$1:
T, x$2: String, x$3: <repeated...>[Object])T
[ERROR] and  method checkNotNull in object Preconditions of type [T](x$1:
T, x$2: Any)T
[ERROR] match argument types ((L, R) => O,String)
[ERROR]     Preconditions.checkNotNull(fun, "Join function must not be
null.")

I see the same error for all of the Scala classes I changed. Any pointers
would be very helpful for me to proceed further. Please let me know if you
need more information.
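For readers hitting the same compile error: the ambiguity arises because a Scala function value plus a String message matches both the `(T, Any)` overload and the `(T, String, Object...)` varargs overload of `checkNotNull`. One workaround, suggested later in this thread, is casting the message to `java.lang.Object` so that only the two-argument overload applies. Below is a minimal, self-contained sketch using a stand-in object with the same overload shape — it is not Guava's actual `Preconditions` class:

```scala
// Stand-in with the same overload shape as Guava's Preconditions.checkNotNull,
// used only to illustrate the cast workaround; not the Guava class itself.
object Pre {
  def checkNotNull[T](ref: T, errorMessage: Any): T = {
    if (ref == null) throw new NullPointerException(String.valueOf(errorMessage))
    ref
  }
  def checkNotNull[T](ref: T, template: String, args: Object*): T = {
    if (ref == null) throw new NullPointerException(template.format(args: _*))
    ref
  }
}

object CastDemo {
  def main(args: Array[String]): Unit = {
    val fun = (l: Int, r: Int) => l + r
    // Pre.checkNotNull(fun, "Join function must not be null.")  // ambiguous, as in the error above
    // Casting the message to java.lang.Object rules out the String/varargs
    // overload, so the call resolves to checkNotNull(T, Any):
    Pre.checkNotNull(fun, "Join function must not be null.".asInstanceOf[Object])
    println(fun(1, 2))
  }
}
```

As the thread goes on to discuss, the Scala modules eventually prefer Scala's own `require` over this cast.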

Thanks in advance for your help and support.

Thanks,
Lokesh

Re: Hello Everyone

Posted by Aljoscha Krettek <al...@apache.org>.
We'll look at why it failed and then decide whether it's good. So for
those KafkaITCase failures we know that it doesn't matter right now.

On Fri, May 15, 2015 at 12:56 AM, Lokesh Rajaram
<ra...@gmail.com> wrote:
> This almost worked. Of the 5 build jobs four passed and one failed.
> What's the acceptance criteria for a pull request? Do I need to build again
> to get all 5 build jobs passing?
>
> Thanks,
> Lokesh
>
> On Thu, May 14, 2015 at 8:50 AM, Robert Metzger <rm...@apache.org> wrote:
>
>> No, you don't have to wait.
>> The KafkaITCase is not always failing. If you're lucky, it will pass with
>> the next run.
>>
>> On Thu, May 14, 2015 at 5:48 PM, Lokesh Rajaram <ra...@gmail.com>
>> wrote:
>>
>> > If I understand it correct, I have to wait for your pull request to be
>> > merged, I can rebase and trigger build again. is that right?
>> >
>> > Thanks Robert, Aljoscha for super fast reply/help.
>> >
>> > Thanks,
>> > Lokesh
>> >
>> > On Thu, May 14, 2015 at 8:39 AM, Robert Metzger <rm...@apache.org>
>> > wrote:
>> >
>> > > However, you can only restart runs in your travis account, not on the
>> > > apache account (also used for validating pull requests).
>> > >
>> > > I have opened a pull request a few minutes ago which will reduce the
>> > number
>> > > of KafkaITCase failures (there is still one other unresolved issue).
>> > >
>> > > On Thu, May 14, 2015 at 5:37 PM, Aljoscha Krettek <aljoscha@apache.org
>> >
>> > > wrote:
>> > >
>> > > > Hi,
>> > > > don't worry, there are very few stupid questions. :D
>> > > >
>> > > > The KafkaITCase sometimes fails on Travis, this is a known problem
>> > > > currently. On travis you can restart the individual runs for a commit
>> > > > in the view of the failed run.
>> > > >
>> > > > Hope that helps.
>> > > >
>> > > > Cheers,
>> > > > Aljoscha
>> > > >
>> > > > On Thu, May 14, 2015 at 5:35 PM, Lokesh Rajaram
>> > > > <ra...@gmail.com> wrote:
>> > > > > Thanks Aljoscha, Robert. After adding guava dependency for
>> > > flink-spargel
>> > > > I
>> > > > > was able to progress further but now it's failing in
>> > > > > flink-streaming-connectors for the following test case:
>> > > > >
>> > > > > KafkaITCase.brokerFailureTest:936->tryExecute:352 Test failed with:
>> > Job
>> > > > > execution failed.
>> > > > >
>> > > > > Any pointers would help me proceed further. Sorry for a lot of
>> > trivial
>> > > > > questions, I am just getting started not familiar with the code
>> base.
>> > > > > I tried running locally, I am able to run it successfully, don't
>> know
>> > > why
>> > > > > it's only failing in Travis build. Not sure if I am missing
>> something
>> > > in
>> > > > my
>> > > > > local environment.
>> > > > >
>> > > > > Thanks,
>> > > > > Lokesh
>> > > > >
>> > > > > On Thu, May 14, 2015 at 1:39 AM, Robert Metzger <
>> rmetzger@apache.org
>> > >
>> > > > wrote:
>> > > > >
>> > > > >> I think flink-spargel is missing the guava dependency.
>> > > > >>
>> > > > >> On Thu, May 14, 2015 at 8:18 AM, Aljoscha Krettek <
>> > > aljoscha@apache.org>
>> > > > >> wrote:
>> > > > >>
>> > > > >> > @Robert, this seems like a problem with the Shading?
>> > > > >> >
>> > > > >> > On Thu, May 14, 2015 at 5:41 AM, Lokesh Rajaram
>> > > > >> > <ra...@gmail.com> wrote:
>> > > > >> > > Thanks Aljoscha. I was able to change as recommended and able
>> > to
>> > > > run
>> > > > >> the
>> > > > >> > > entire test suite in local successfully.
>> > > > >> > > However Travis build is failing for pull request:
>> > > > >> > > https://github.com/apache/flink/pull/673.
>> > > > >> > >
>> > > > >> > > It's a compilation failure:
>> > > > >> > >
>> > > > >> > > [ERROR] Failed to execute goal
>> > > > >> > > org.apache.maven.plugins:maven-compiler-plugin:3.1:compile
>> > > > >> > > (default-compile) on project flink-spargel: Compilation
>> failure:
>> > > > >> > > Compilation failure:
>> > > > >> > > [ERROR]
>> > > > >> > >
>> > > > >> >
>> > > > >>
>> > > >
>> > >
>> >
>> /home/travis/build/apache/flink/flink-staging/flink-spargel/src/main/java/org/apache/flink/spargel/java/VertexCentricIteration.java:[42,30]
>> > > > >> > > package com.google.common.base does not exist
>> > > > >> > >
>> > > > >> > > I can definitely see the package imported in the class,
>> > compiling
>> > > > and
>> > > > >> > > passing all tests in local.
>> > > > >> > > Anything I am missing here?
>> > > > >> > >
>> > > > >> > > Thanks,
>> > > > >> > > Lokesh
>> > > > >> > >
>> > > > >> > > On Mon, May 11, 2015 at 1:25 AM, Aljoscha Krettek <
>> > > > aljoscha@apache.org
>> > > > >> >
>> > > > >> > > wrote:
>> > > > >> > >
>> > > > >> > >> I think you can replace Validate.NotNull(p) with require(p !=
>> > > > null, "p
>> > > > >> > >> is null (or something like this)").
>> > > > >> > >>
>> > > > >> > >> On Mon, May 11, 2015 at 12:27 AM, Lokesh Rajaram
>> > > > >> > >> <ra...@gmail.com> wrote:
>> > > > >> > >> > 1. I think I can use require for replacing Validate.isTrue
>> > > > >> > >> > 2. What about Validate.notNull? If require is used it would
>> > > throw
>> > > > >> > >> > IllegalArgumentException,
>> > > > >> > >> > if assume or assert is used it would throw AssertionError
>> > which
>> > > > is
>> > > > >> not
>> > > > >> > >> > compatible with current implementation.
>> > > > >> > >> >
>> > > > >> > >> > Please let me know if my understanding is correct. Also,
>> let
>> > me
>> > > > know
>> > > > >> > your
>> > > > >> > >> > thoughts.
>> > > > >> > >> >
>> > > > >> > >> > Thanks,
>> > > > >> > >> > Lokesh
>> > > > >> > >> >
>> > > > >> > >> > On Sun, May 10, 2015 at 1:04 AM, Aljoscha Krettek <
>> > > > >> > aljoscha@apache.org>
>> > > > >> > >> > wrote:
>> > > > >> > >> >
>> > > > >> > >> >> I would propose using the methods as Chiwan suggested. If
>> > > > everyone
>> > > > >> > >> >> agrees I can change the Jira issue.
>> > > > >> > >> >>
>> > > > >> > >> >> On Sun, May 10, 2015 at 6:47 AM, Lokesh Rajaram
>> > > > >> > >> >> <ra...@gmail.com> wrote:
>> > > > >> > >> >> > Thank you for the reference links. Which approach
>> should I
>> > > > take,
>> > > > >> > >> casting
>> > > > >> > >> >> or
>> > > > >> > >> >> > use scala methods.
>> > > > >> > >> >> > If it's the latter option will the JIRA ticket
>> FLINK-1711
>> > > > >> > >> >> > <https://issues.apache.org/jira/browse/FLINK-1711> be
>> > > > updated to
>> > > > >> > >> >> reflect it?
>> > > > >> > >> >> >
>> > > > >> > >> >> > Thanks,
>> > > > >> > >> >> > Lokesh
>> > > > >> > >> >> >
>> > > > >> > >> >> > On Sat, May 9, 2015 at 8:16 PM, Chiwan Park <
>> > > > >> chiwanpark@icloud.com
>> > > > >> > >
>> > > > >> > >> >> wrote:
>> > > > >> > >> >> >
>> > > > >> > >> >> Hi. There are some problems using Guava’s check method
>> in
>> > > > Scala.
>> > > > >> (
>> > > > >> > >> >> >>
>> > > > >> https://groups.google.com/forum/#!topic/guava-discuss/juwovq26R3k
>> > > > >> > <
>> > > > >> > >> >> >>
>> > > > >> https://groups.google.com/forum/#!topic/guava-discuss/juwovq26R3k
>> > > > >> > >)
>> > > > >> > >> You
>> > > > >> > >> >> >> can solve this error simply with casting last argument
>> to
>> > > > >> > >> >> java.lang.Object.
>> > > > >> > >> >> >> But I think we’d better use `require`, `assume`,
>> `assert`
>> > > > method
>> > > > >> > >> >> provided
>> > > > >> > >> >> >> by Scala. (
>> > > > >> > >> >> >>
>> > > > >> http://daily-scala.blogspot.kr/2010/03/assert-require-assume.html
>> > > > >> > <
>> > > > >> > >> >> >>
>> > > > >> http://daily-scala.blogspot.kr/2010/03/assert-require-assume.html
>> > > > >> > >)
>> > > > >> > >> >> >> Because this change affects a lot of other code, we
>> > > > >> > >> >> >> should discuss changing Guava's methods to Scala’s.
>> > > > >> > >> >> >>
>> > > > >> > >> >> >> Regards.
>> > > > >> > >> >> >> Chiwan Park (Sent with iPhone)
>> > > > >> > >> >> >>
>> > > > >> > >> >> >>
>> > > > >> > >> >> >>
>> > > > >> > >> >> >> > On May 10, 2015, at 11:49 AM, Lokesh Rajaram <
>> > > > >> > >> >> rajaram.lokesh@gmail.com>
>> > > > >> > >> >> >> wrote:
>> > > > >> > >> >> >> >
>> > > > >> > >> >> >> > Hello All,
>> > > > >> > >> >> >> >
>> > > > >> > >> >> >> > I am new to Flink community and am very excited about
>> > the
>> > > > >> > project
>> > > > >> > >> and
>> > > > >> > >> >> >> work
>> > > > >> > >> >> >> > you all have been doing. Kudos!!
>> > > > >> > >> >> >> >
>> > > > >> > >> >> >> > I was looking to pickup some starter task. Robert
>> > > > recommended
>> > > > >> to
>> > > > >> > >> pick
>> > > > >> > >> >> up
>> > > > >> > >> >> >> > https://issues.apache.org/jira/browse/FLINK-1711.
>> > Thanks
>> > > > >> Robert
>> > > > >> > >> for
>> > > > >> > >> >> your
>> > > > >> > >> >> >> > guidance.
>> > > > >> > >> >> >> >
>> > > > >> > >> >> >> > Sorry for a dumb question. I am done with code
>> changes
>> > > but
>> > > > my
>> > > > >> > "mvn
>> > > > >> > >> >> >> verify"
>> > > > >> > >> >> >> > failing only for the scala module as follows
>> > > > >> > >> >> >> >
>> > > > >> > >> >> >> >
>> > > > >> > >> >> >>
>> > > > >> > >> >>
>> > > > >> > >>
>> > > > >> >
>> > > > >>
>> > > >
>> > >
>> >
>> flink/flink-scala/src/main/scala/org/apache/flink/api/scala/joinDataSet.scala:77:
>> > > > >> > >> >> >> > error: ambiguous reference to overloaded definition,
>> > > > >> > >> >> >> > [ERROR] both method checkNotNull in object
>> > Preconditions
>> > > of
>> > > > >> type
>> > > > >> > >> >> [T](x$1:
>> > > > >> > >> >> >> > T, x$2: String, x$3: <repeated...>[Object])T
>> > > > >> > >> >> >> > [ERROR] and  method checkNotNull in object
>> > Preconditions
>> > > of
>> > > > >> type
>> > > > >> > >> >> [T](x$1:
>> > > > >> > >> >> >> > T, x$2: Any)T
>> > > > >> > >> >> >> > [ERROR] match argument types ((L, R) => O,String)
>> > > > >> > >> >> >> > [ERROR]     Preconditions.checkNotNull(fun, "Join
>> > > function
>> > > > >> must
>> > > > >> > >> not be
>> > > > >> > >> >> >> > null.")
>> > > > >> > >> >> >> >
>> > > > >> > >> >> >> > Same error I see for all of the Scala classes I
>> > changed.
>> > > > Any
>> > > > >> > >> pointers
>> > > > >> > >> >> >> here
>> > > > >> > >> >> >> > will be very helpful for me to proceed further.
>> Please
>> > > let
>> > > > me
>> > > > >> > know
>> > > > >> > >> if
>> > > > >> > >> >> you
>> > > > >> > >> >> >> > need more information.
>> > > > >> > >> >> >> >
>> > > > >> > >> >> >> > Thanks in advance for your help and support.
>> > > > >> > >> >> >> >
>> > > > >> > >> >> >> > Thanks,
>> > > > >> > >> >> >> > Lokesh
>> > > > >> > >> >> >>
>> > > > >> > >> >> >>
>> > > > >> > >> >>
>> > > > >> > >>
>> > > > >> >
>> > > > >>
>> > > >
>> > >
>> >
>>
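The replacement discussed in the quoted thread above — using Scala's built-in `require` in place of `Validate.notNull`/`Validate.isTrue` — can be sketched as follows. As noted in the thread, `require` throws `IllegalArgumentException` on failure, matching the behavior of the existing checks (whereas `assert`/`assume` would throw `AssertionError`). The function and names below are illustrative, not Flink's actual code:

```scala
object RequireDemo {
  // Illustrative stand-in for a precondition check such as the one in
  // JoinDataSet: reject a null join function up front.
  def withJoinFunction[L, R, O](fun: (L, R) => O): (L, R) => O = {
    // Scala's require throws IllegalArgumentException, the exception type
    // the thread notes is compatible with the current implementation, so
    // callers that catch IllegalArgumentException keep working.
    require(fun != null, "Join function must not be null.")
    fun
  }

  def main(args: Array[String]): Unit = {
    val joined = withJoinFunction((l: Int, r: Int) => l + r)
    println(joined(2, 3)) // prints 5
    try {
      withJoinFunction(null: (Int, Int) => Int)
    } catch {
      case e: IllegalArgumentException => println("rejected: " + e.getMessage)
    }
  }
}
```

Because `require` is a plain Scala method, it also sidesteps the Guava overload ambiguity entirely in the Scala modules.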

Re: Hello Everyone

Posted by Lokesh Rajaram <ra...@gmail.com>.
This almost worked: of the five build jobs, four passed and one failed.
What are the acceptance criteria for a pull request? Do I need to build again
to get all five jobs passing?

Thanks,
Lokesh


Re: Hello Everyone

Posted by Robert Metzger <rm...@apache.org>.
No, you don't have to wait.
The KafkaITCase is not always failing. If you're lucky, it will pass with
the next run.


Re: Hello Everyone

Posted by Lokesh Rajaram <ra...@gmail.com>.
If I understand it correctly, I have to wait for your pull request to be
merged; then I can rebase and trigger the build again. Is that right?

Thanks Robert, Aljoscha for super fast reply/help.

Thanks,
Lokesh

> > >> > > (default-compile) on project flink-spargel: Compilation failure:
> > >> > > Compilation failure:
> > >> > > [ERROR]
> > >> > >
> > >> >
> > >>
> >
> /home/travis/build/apache/flink/flink-staging/flink-spargel/src/main/java/org/apache/flink/spargel/java/VertexCentricIteration.java:[42,30]
> > >> > > package com.google.common.base does not exist
> > >> > >
> > >> > > I can definitely see the package imported in the class, compiling
> > and
> > >> > > passing all tests in local.
> > >> > > Anything I am missing here?
> > >> > >
> > >> > > Thanks,
> > >> > > Lokesh
> > >> > >
> > >> > > On Mon, May 11, 2015 at 1:25 AM, Aljoscha Krettek <
> > aljoscha@apache.org
> > >> >
> > >> > > wrote:
> > >> > >
> > >> > >> I think you can replace Validate.NotNull(p) with require(p !=
> > null, "p
> > >> > >> is null (or something like this)").
> > >> > >>
> > >> > >> On Mon, May 11, 2015 at 12:27 AM, Lokesh Rajaram
> > >> > >> <ra...@gmail.com> wrote:
> > >> > >> > 1. I think I can use require for replacing Validate.isTrue
> > >> > >> > 2. What about Validate.notNull? If require is used it would
> throw
> > >> > >> > IllegalArgumentException,
> > >> > >> > if assume or assert is used it would throw AssertionError which
> > is
> > >> not
> > >> > >> > compatible with current implementation.
> > >> > >> >
> > >> > >> > Please let me know if my understanding is correct. Also, let me
> > know
> > >> > your
> > >> > >> > thoughts.
> > >> > >> >
> > >> > >> > Thanks,
> > >> > >> > Lokesh
> > >> > >> >
> > >> > >> > On Sun, May 10, 2015 at 1:04 AM, Aljoscha Krettek <
> > >> > aljoscha@apache.org>
> > >> > >> > wrote:
> > >> > >> >
> > >> > >> >> I would propose using the methods as Chiwan suggested. If
> > everyone
> > >> > >> >> agrees I can change the Jira issue.
> > >> > >> >>
> > >> > >> >> On Sun, May 10, 2015 at 6:47 AM, Lokesh Rajaram
> > >> > >> >> <ra...@gmail.com> wrote:
> > >> > >> >> > Thank you for the reference links. Which approach should I
> > take,
> > >> > >> casting
> > >> > >> >> or
> > >> > >> >> > use scala methods.
> > >> > >> >> > If it's the latter option will the JIRA ticket FLINK-1711
> > >> > >> >> > <https://issues.apache.org/jira/browse/FLINK-1711> be
> > updated to
> > >> > >> >> reflect it?
> > >> > >> >> >
> > >> > >> >> > Thanks,
> > >> > >> >> > Lokesh
> > >> > >> >> >
> > >> > >> >> > On Sat, May 9, 2015 at 8:16 PM, Chiwan Park <
> > >> chiwanpark@icloud.com
> > >> > >
> > >> > >> >> wrote:
> > >> > >> >> >
> > >> > >> >> >> Hi. There is some problems using Guava’s check method in
> > Scala.
> > >> (
> > >> > >> >> >>
> > >> https://groups.google.com/forum/#!topic/guava-discuss/juwovq26R3k
> > >> > <
> > >> > >> >> >>
> > >> https://groups.google.com/forum/#!topic/guava-discuss/juwovq26R3k
> > >> > >)
> > >> > >> You
> > >> > >> >> >> can solve this error simply with casting last argument to
> > >> > >> >> java.lang.Object.
> > >> > >> >> >> But I think we’d better use `require`, `assume`, `assert`
> > method
> > >> > >> >> provided
> > >> > >> >> >> by Scala. (
> > >> > >> >> >>
> > >> http://daily-scala.blogspot.kr/2010/03/assert-require-assume.html
> > >> > <
> > >> > >> >> >>
> > >> http://daily-scala.blogspot.kr/2010/03/assert-require-assume.html
> > >> > >)
> > >> > >> >> >> Because this changes affects many other codes, so we should
> > >> > discuss
> > >> > >> >> about
> > >> > >> >> >> changing Guava's method to Scala’s method.
> > >> > >> >> >>
> > >> > >> >> >> Regards.
> > >> > >> >> >> Chiwan Park (Sent with iPhone)
> > >> > >> >> >>
> > >> > >> >> >>
> > >> > >> >> >>
> > >> > >> >> >> > On May 10, 2015, at 11:49 AM, Lokesh Rajaram <
> > >> > >> >> rajaram.lokesh@gmail.com>
> > >> > >> >> >> wrote:
> > >> > >> >> >> >
> > >> > >> >> >> > Hello All,
> > >> > >> >> >> >
> > >> > >> >> >> > I am new to Flink community and am very excited about the
> > >> > project
> > >> > >> and
> > >> > >> >> >> work
> > >> > >> >> >> > you all have been doing. Kudos!!
> > >> > >> >> >> >
> > >> > >> >> >> > I was looking to pickup some starter task. Robert
> > recommended
> > >> to
> > >> > >> pick
> > >> > >> >> up
> > >> > >> >> >> > https://issues.apache.org/jira/browse/FLINK-1711. Thanks
> > >> Robert
> > >> > >> for
> > >> > >> >> your
> > >> > >> >> >> > guidance.
> > >> > >> >> >> >
> > >> > >> >> >> > Sorry for a dumb question. I am done with code changes
> but
> > my
> > >> > "mvn
> > >> > >> >> >> verify"
> > >> > >> >> >> > failing only for the scala module as follows
> > >> > >> >> >> >
> > >> > >> >> >> >
> > >> > >> >> >>
> > >> > >> >>
> > >> > >>
> > >> >
> > >>
> >
> flink/flink-scala/src/main/scala/org/apache/flink/api/scala/joinDataSet.scala:77:
> > >> > >> >> >> > error: ambiguous reference to overloaded definition,
> > >> > >> >> >> > [ERROR] both method checkNotNull in object Preconditions
> of
> > >> type
> > >> > >> >> [T](x$1:
> > >> > >> >> >> > T, x$2: String, x$3: <repeated...>[Object])T
> > >> > >> >> >> > [ERROR] and  method checkNotNull in object Preconditions
> of
> > >> type
> > >> > >> >> [T](x$1:
> > >> > >> >> >> > T, x$2: Any)T
> > >> > >> >> >> > [ERROR] match argument types ((L, R) => O,String)
> > >> > >> >> >> > [ERROR]     Preconditions.checkNotNull(fun, "Join
> function
> > >> must
> > >> > >> not be
> > >> > >> >> >> > null.")
> > >> > >> >> >> >
> > >> > >> >> >> > Same error I see for all of the Scala classes I changed.
> > Any
> > >> > >> pointers
> > >> > >> >> >> here
> > >> > >> >> >> > will be very helpful for me to proceed further. Please
> let
> > me
> > >> > know
> > >> > >> if
> > >> > >> >> you
> > >> > >> >> >> > need more information.
> > >> > >> >> >> >
> > >> > >> >> >> > Thanks in advance for your help and support.
> > >> > >> >> >> >
> > >> > >> >> >> > Thanks,
> > >> > >> >> >> > Lokesh
> > >> > >> >> >>
> > >> > >> >> >>
> > >> > >> >>
> > >> > >>
> > >> >
> > >>
> >
>
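
The change discussed in the quoted thread above — replacing the ambiguous Guava `Preconditions.checkNotNull` calls with Scala's built-in `require` — can be sketched as follows. This is a minimal illustration with hypothetical names, not the actual Flink source:

```scala
// Guava's Preconditions.checkNotNull(fun, msg) is ambiguous from Scala:
// a function value plus a String message matches both the (T, Any) and the
// (T, String, Object*) overloads. One workaround is to cast the message,
// e.g. checkNotNull(fun, msg: java.lang.Object); the approach agreed on in
// the thread is Scala's require, which throws IllegalArgumentException just
// like the old Validate.isTrue / Validate.notNull calls did.

object PreconditionSketch {
  def checkedJoinFun[L, R, O](fun: (L, R) => O): (L, R) => O = {
    // require prefixes its message with "requirement failed: "
    require(fun != null, "Join function must not be null.")
    fun
  }

  def main(args: Array[String]): Unit = {
    val join = checkedJoinFun((l: Int, r: Int) => l + r)
    println(join(20, 22)) // prints 42

    try {
      checkedJoinFun(null.asInstanceOf[(Int, Int) => Int])
    } catch {
      case e: IllegalArgumentException => println("caught: " + e.getMessage)
    }
  }
}
```

Note that `require` raises `IllegalArgumentException`, whereas Scala's `assert`/`assume` raise `AssertionError` — which is why the thread settles on `require` to stay compatible with the existing behavior.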

Re: Hello Everyone

Posted by Robert Metzger <rm...@apache.org>.
However, you can only restart runs in your own Travis account, not in the
Apache account (which is also used for validating pull requests).

I opened a pull request a few minutes ago that should reduce the number
of KafkaITCase failures (there is still one other unresolved issue).

On Thu, May 14, 2015 at 5:37 PM, Aljoscha Krettek <al...@apache.org>
wrote:

> Hi,
> don't worry, there are very few stupid questions. :D
>
> The KafkaITCase sometimes fails on Travis, this is a known problem
> currently. On travis you can restart the individual runs for a commit
> in the view of the failed run.
>
> Hope that helps.
>
> Cheers,
> Aljoscha

Re: Hello Everyone

Posted by Aljoscha Krettek <al...@apache.org>.
Hi,
don't worry, there are very few stupid questions. :D

The KafkaITCase sometimes fails on Travis; this is a known problem at
the moment. On Travis you can restart the individual runs for a commit
from the view of the failed run.

Hope that helps.

Cheers,
Aljoscha

On Thu, May 14, 2015 at 5:35 PM, Lokesh Rajaram
<ra...@gmail.com> wrote:
> Thanks Aljoscha, Robert. After adding guava dependency for flink-spargel I
> was able to progress further but now it's failing in
> flink-streaming-connectors for the following test case:
>
> KafkaITCase.brokerFailureTest:936->tryExecute:352 Test failed with: Job
> execution failed.
>
> Any pointers would help me proceed further. Sorry for a lot of trivial
> questions, I am just getting started not familiar with the code base.
> I tried running locally, I am able to run it successfully, don't know why
> it's only failing in Travis build. Not sure if I am missing something in my
> local environment.
>
> Thanks,
> Lokesh

Re: Hello Everyone

Posted by Lokesh Rajaram <ra...@gmail.com>.
Thanks Aljoscha, Robert. After adding the Guava dependency to flink-spargel
I was able to progress further, but now the build is failing in
flink-streaming-connectors for the following test case:

KafkaITCase.brokerFailureTest:936->tryExecute:352 Test failed with: Job
execution failed.

Any pointers would help me proceed further. Sorry for a lot of trivial
questions; I am just getting started and not yet familiar with the code base.
When I run the tests locally they pass, so I don't know why they fail only
in the Travis build. Not sure if I am missing something in my local
environment.

Thanks,
Lokesh

On Thu, May 14, 2015 at 1:39 AM, Robert Metzger <rm...@apache.org> wrote:

> I think flink-spargel is missing the guava dependency.

Re: Hello Everyone

Posted by Robert Metzger <rm...@apache.org>.
I think flink-spargel is missing the guava dependency.
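
If flink-spargel does rely on Guava directly, the usual fix is to declare the dependency in the module's own pom.xml. A sketch of such an addition — the exact coordinates and the `${guava.version}` property are assumptions based on common Maven practice, not taken from the Flink poms:

```xml
<!-- Hypothetical addition to flink-staging/flink-spargel/pom.xml;
     assumes a guava.version property is defined in a parent pom. -->
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>${guava.version}</version>
</dependency>
```

Declaring the dependency explicitly avoids relying on it arriving transitively, which can break when a parent module shades or relocates Guava.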

On Thu, May 14, 2015 at 8:18 AM, Aljoscha Krettek <al...@apache.org>
wrote:

> @Robert, this seems like a problem with the Shading?
>
> On Thu, May 14, 2015 at 5:41 AM, Lokesh Rajaram
> <ra...@gmail.com> wrote:
> > Thanks Aljioscha. I was able to change as recommended and able to run the
> > entire test suite in local successfully.
> > However Travis build is failing for pull request:
> > https://github.com/apache/flink/pull/673.
> >
> > It's a compilation failure:
> >
> > [ERROR] Failed to execute goal
> > org.apache.maven.plugins:maven-compiler-plugin:3.1:compile
> > (default-compile) on project flink-spargel: Compilation failure:
> > Compilation failure:
> > [ERROR]
> >
> /home/travis/build/apache/flink/flink-staging/flink-spargel/src/main/java/org/apache/flink/spargel/java/VertexCentricIteration.java:[42,30]
> > package com.google.common.base does not exist
> >
> > I can definitely see the package imported in the class, compiling and
> > passing all tests in local.
> > Anything I am missing here?
> >
> > Thanks,
> > Lokesh

Re: Hello Everyone

Posted by Aljoscha Krettek <al...@apache.org>.
@Robert, this seems like a problem with the shading?

On Thu, May 14, 2015 at 5:41 AM, Lokesh Rajaram
<ra...@gmail.com> wrote:
> Thanks Aljioscha. I was able to change as recommended and able to run the
> entire test suite in local successfully.
> However Travis build is failing for pull request:
> https://github.com/apache/flink/pull/673.
>
> It's a compilation failure:
>
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-compiler-plugin:3.1:compile
> (default-compile) on project flink-spargel: Compilation failure:
> Compilation failure:
> [ERROR]
> /home/travis/build/apache/flink/flink-staging/flink-spargel/src/main/java/org/apache/flink/spargel/java/VertexCentricIteration.java:[42,30]
> package com.google.common.base does not exist
>
> I can definitely see the package imported in the class, compiling and
> passing all tests in local.
> Anything I am missing here?
>
> Thanks,
> Lokesh

Re: Hello Everyone

Posted by Lokesh Rajaram <ra...@gmail.com>.
Thanks, Aljoscha. I made the changes as recommended and was able to run the
entire test suite locally.
However Travis build is failing for pull request:
https://github.com/apache/flink/pull/673.

It's a compilation failure:

[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-compiler-plugin:3.1:compile
(default-compile) on project flink-spargel: Compilation failure:
Compilation failure:
[ERROR]
/home/travis/build/apache/flink/flink-staging/flink-spargel/src/main/java/org/apache/flink/spargel/java/VertexCentricIteration.java:[42,30]
package com.google.common.base does not exist

I can definitely see the package imported in the class, compiling and
passing all tests locally.
Anything I am missing here?

Thanks,
Lokesh

On Mon, May 11, 2015 at 1:25 AM, Aljoscha Krettek <al...@apache.org>
wrote:

> I think you can replace Validate.NotNull(p) with require(p != null, "p
> is null (or something like this)").

Re: Hello Everyone

Posted by Aljoscha Krettek <al...@apache.org>.
I think you can replace Validate.notNull(p) with require(p != null, "p
is null") (or something like this).
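
A minimal sketch of that replacement (the helper name is made up, not from
the Flink code; `require` is Scala's built-in from `Predef` and throws
IllegalArgumentException when the condition fails):

```scala
// Before (Apache Commons Lang):
//   Validate.notNull(fun, "Join function must not be null.")
// After (plain Scala, no external dependency):
def checkedJoinFunction[F](fun: F): F = {
  require(fun != null, "Join function must not be null.")
  fun
}
```

Note that require's message is a by-name parameter, so the string is only
built when the check actually fails.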

On Mon, May 11, 2015 at 12:27 AM, Lokesh Rajaram
<ra...@gmail.com> wrote:
> 1. I think I can use require for replacing Validate.isTrue
> 2. What about Validate.notNull? If require is used it would throw
> IllegalArgumentException,
> if assume or assert is used it would throw AssertionError which is not
> compatible with current implementation.
>
> Please let me know if my understanding is correct. Also, let me know your
> thoughts.
>
> Thanks,
> Lokesh

Re: Hello Everyone

Posted by Lokesh Rajaram <ra...@gmail.com>.
1. I think I can use require to replace Validate.isTrue.
2. What about Validate.notNull? If require is used, it would throw
IllegalArgumentException;
if assume or assert is used, it would throw AssertionError, which is not
compatible with the current implementation.
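
That matches the standard library's behavior; a quick sketch (plain Scala,
not Flink code) showing which exception each Predef method throws when its
condition is false:

```scala
// Helper that runs a check and reports the exception type it raised.
def exceptionNameOf(check: => Unit): String =
  try { check; "none" }
  catch { case t: Throwable => t.getClass.getSimpleName }

// require signals a bad argument from the caller.
val fromRequire = exceptionNameOf(require(false, "invalid argument"))
// assert and assume signal a broken internal invariant/assumption.
val fromAssert = exceptionNameOf(assert(false, "invariant broken"))
val fromAssume = exceptionNameOf(assume(false, "assumption unmet"))

println(s"require -> $fromRequire, assert -> $fromAssert, assume -> $fromAssume")
```

Also worth noting: assert and assume calls can be compiled away with
-Xelide-below, while require always runs.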

Please let me know if my understanding is correct. Also, let me know your
thoughts.

Thanks,
Lokesh

On Sun, May 10, 2015 at 1:04 AM, Aljoscha Krettek <al...@apache.org>
wrote:

> I would propose using the methods as Chiwan suggested. If everyone
> agrees I can change the Jira issue.

Re: Hello Everyone

Posted by Aljoscha Krettek <al...@apache.org>.
I would propose using the methods as Chiwan suggested. If everyone
agrees I can change the Jira issue.

On Sun, May 10, 2015 at 6:47 AM, Lokesh Rajaram
<ra...@gmail.com> wrote:
> Thank you for the reference links. Which approach should I take, casting or
> use scala methods.
> If it's the latter option will the JIRA ticket FLINK-1711
> <https://issues.apache.org/jira/browse/FLINK-1711> be updated to reflect it?
>
> Thanks,
> Lokesh

Re: Hello Everyone

Posted by Lokesh Rajaram <ra...@gmail.com>.
Thank you for the reference links. Which approach should I take: casting, or
using the Scala methods?
If it's the latter option, will the JIRA ticket FLINK-1711
<https://issues.apache.org/jira/browse/FLINK-1711> be updated to reflect it?

Thanks,
Lokesh

On Sat, May 9, 2015 at 8:16 PM, Chiwan Park <ch...@icloud.com> wrote:

> Hi. There is some problems using Guava’s check method in Scala. (
> https://groups.google.com/forum/#!topic/guava-discuss/juwovq26R3k <
> https://groups.google.com/forum/#!topic/guava-discuss/juwovq26R3k>) You
> can solve this error simply with casting last argument to java.lang.Object.
> But I think we’d better use `require`, `assume`, `assert` method provided
> by Scala. (
> http://daily-scala.blogspot.kr/2010/03/assert-require-assume.html <
> http://daily-scala.blogspot.kr/2010/03/assert-require-assume.html>)
> Because this changes affects many other codes, so we should discuss about
> changing Guava's method to Scala’s method.
>
> Regards.
> Chiwan Park (Sent with iPhone)

Re: Hello Everyone

Posted by Chiwan Park <ch...@icloud.com>.
Hi. There are some problems using Guava’s check methods in Scala. (https://groups.google.com/forum/#!topic/guava-discuss/juwovq26R3k) You can solve this error simply by casting the last argument to java.lang.Object. But I think we’d better use the `require`, `assume`, and `assert` methods provided by Scala. (http://daily-scala.blogspot.kr/2010/03/assert-require-assume.html)
Because this change affects a lot of other code, we should discuss changing Guava’s methods to Scala’s methods.
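
To make the ambiguity concrete, here is a sketch (not Flink or Guava code)
that mirrors Guava's two checkNotNull overloads; a plain (value, String)
call matches both, and a type ascription to java.lang.Object resolves it:

```scala
// Mirrors the two Guava overloads that collide when called from Scala.
object MyPreconditions {
  def checkNotNull[T](ref: T, message: Any): T = {
    if (ref == null) throw new NullPointerException(String.valueOf(message))
    ref
  }
  def checkNotNull[T](ref: T, template: String, args: AnyRef*): T = {
    if (ref == null) throw new NullPointerException(template.format(args: _*))
    ref
  }
}

val fun: (Int, Int) => Int = (l, r) => l + r

// Ambiguous -- both overloads match a (value, String) call:
//   MyPreconditions.checkNotNull(fun, "Join function must not be null.")
// Resolved by ascribing the message to java.lang.Object, which rules out
// the (T, String, AnyRef*) overload:
val checked =
  MyPreconditions.checkNotNull(fun, "Join function must not be null.": java.lang.Object)
```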

Regards.
Chiwan Park (Sent with iPhone)



> On May 10, 2015, at 11:49 AM, Lokesh Rajaram <ra...@gmail.com> wrote:
> 
> Hello All,
> 
> I am new to Flink community and am very excited about the project and work
> you all have been doing. Kudos!!
> 
> I was looking to pickup some starter task. Robert recommended to pick up
> https://issues.apache.org/jira/browse/FLINK-1711. Thanks Robert for your
> guidance.
> 
> Sorry for a dumb question. I am done with code changes but my "mvn verify"
> failing only for the scala module as follows
> 
> flink/flink-scala/src/main/scala/org/apache/flink/api/scala/joinDataSet.scala:77:
> error: ambiguous reference to overloaded definition,
> [ERROR] both method checkNotNull in object Preconditions of type [T](x$1:
> T, x$2: String, x$3: <repeated...>[Object])T
> [ERROR] and  method checkNotNull in object Preconditions of type [T](x$1:
> T, x$2: Any)T
> [ERROR] match argument types ((L, R) => O,String)
> [ERROR]     Preconditions.checkNotNull(fun, "Join function must not be
> null.")
> 
> Same error I see for all of the Scala classes I changed. Any pointers here
> will be very helpful for me to proceed further. Please let me know if you
> need more information.
> 
> Thanks in advance for your help and support.
> 
> Thanks,
> Lokesh