Posted to dev@spark.apache.org by Jeff Zhang <zj...@gmail.com> on 2015/11/03 11:50:50 UTC

Master build fails ?

Looks like it's due to a guava version conflict; I see both guava 14.0.1 and
16.0.1 under lib_managed/bundles. Has anyone else met this issue?

[error]
/Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
object HashCodes is not a member of package com.google.common.hash
[error] import com.google.common.hash.HashCodes
[error]        ^
[info] Resolving org.apache.commons#commons-math;2.2 ...
[error]
/Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
not found: value HashCodes
[error]         val cookie = HashCodes.fromBytes(secret).toString()
[error]                      ^




-- 
Best Regards

Jeff Zhang

Re: Master build fails ?

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

It appears it's time to switch to my lovely sbt then!

Pozdrawiam,
Jacek

--
Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski


On Tue, Nov 3, 2015 at 2:58 PM, Jean-Baptiste Onofré <jb...@nanthrax.net> wrote:
> Hi Jacek,
>
> it works fine with mvn: the problem is with sbt.
>
> I suspect a different reactor order in sbt compared to mvn.
>
> Regards
> JB
>
>
> On 11/03/2015 02:44 PM, Jacek Laskowski wrote:
>>
>> Hi,
>>
>> Just built the sources using the following command and it worked fine.
>>
>> ➜  spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6
>> -Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver
>> -DskipTests clean install
>> ...
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] BUILD SUCCESS
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] Total time: 14:15 min
>> [INFO] Finished at: 2015-11-03T14:40:40+01:00
>> [INFO] Final Memory: 438M/1972M
>> [INFO]
>> ------------------------------------------------------------------------
>>
>> ➜  spark git:(master) ✗ java -version
>> java version "1.8.0_66"
>> Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
>> Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)
>>
>> I'm on Mac OS.
>>
>> Pozdrawiam,
>> Jacek
>>
>> --
>> Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
>> Follow me at https://twitter.com/jaceklaskowski
>> Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski
>>
>>
>> On Tue, Nov 3, 2015 at 1:37 PM, Jean-Baptiste Onofré <jb...@nanthrax.net>
>> wrote:
>>>
>>> Thanks for the update, I used mvn to build but without hive profile.
>>>
>>> Let me try with mvn with the same options as you and sbt also.
>>>
>>> I keep you posted.
>>>
>>> Regards
>>> JB
>>>
>>> On 11/03/2015 12:55 PM, Jeff Zhang wrote:
>>>>
>>>>
>>>> I found it is due to SPARK-11073.
>>>>
>>>> Here's the command I used to build
>>>>
>>>> build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
>>>> -Psparkr
>>>>
>>>> On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré <jb@nanthrax.net
>>>> <ma...@nanthrax.net>> wrote:
>>>>
>>>>      Hi Jeff,
>>>>
>>>>      it works for me (with skipping the tests).
>>>>
>>>>      Let me try again, just to be sure.
>>>>
>>>>      Regards
>>>>      JB
>>>>
>>>>
>>>>      On 11/03/2015 11:50 AM, Jeff Zhang wrote:
>>>>
>>>>          Looks like it's due to guava version conflicts, I see both
>>>> guava
>>>>          14.0.1
>>>>          and 16.0.1 under lib_managed/bundles. Anyone meet this issue
>>>> too ?
>>>>
>>>>          [error]
>>>>
>>>>
>>>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
>>>>          object HashCodes is not a member of package
>>>> com.google.common.hash
>>>>          [error] import com.google.common.hash.HashCodes
>>>>          [error]        ^
>>>>          [info] Resolving org.apache.commons#commons-math;2.2 ...
>>>>          [error]
>>>>
>>>>
>>>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
>>>>          not found: value HashCodes
>>>>          [error]         val cookie =
>>>> HashCodes.fromBytes(secret).toString()
>>>>          [error]                      ^
>>>>
>>>>
>>>>
>>>>
>>>>          --
>>>>          Best Regards
>>>>
>>>>          Jeff Zhang
>>>>
>>>>
>>>>      --
>>>>      Jean-Baptiste Onofré
>>>>      jbonofre@apache.org <ma...@apache.org>
>>>>      http://blog.nanthrax.net
>>>>      Talend - http://www.talend.com
>>>>
>>>>
>>>> ---------------------------------------------------------------------
>>>>      To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>>      <ma...@spark.apache.org>
>>>>      For additional commands, e-mail: dev-help@spark.apache.org
>>>>      <ma...@spark.apache.org>
>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Best Regards
>>>>
>>>> Jeff Zhang
>>>
>>>
>>>
>>> --
>>> Jean-Baptiste Onofré
>>> jbonofre@apache.org
>>> http://blog.nanthrax.net
>>> Talend - http://www.talend.com
>>>
>>>
>>
>>
>
> --
> Jean-Baptiste Onofré
> jbonofre@apache.org
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>
>



Re: Master build fails ?

Posted by Ted Yu <yu...@gmail.com>.
Since maven is the preferred build vehicle, an ivy-style dependency policy
would produce surprising results compared to today's behavior.

I would suggest staying with the current dependency policy.

My two cents.

On Fri, Nov 6, 2015 at 6:25 AM, Koert Kuipers <ko...@tresata.com> wrote:

> If there is no strong preference for one dependency policy over the other,
> but consistency between the two systems is desired, then I believe Maven can
> be made to behave like ivy fairly easily with a setting in the pom.
>
> On Fri, Nov 6, 2015 at 5:21 AM, Steve Loughran <st...@hortonworks.com>
> wrote:
>
>>
>> > On 5 Nov 2015, at 20:07, Marcelo Vanzin <va...@cloudera.com> wrote:
>> >
>> > Man that command is slow. Anyway, it seems guava 16 is being brought
>> > transitively by curator 2.6.0 which should have been overridden by the
>> > explicit dependency on curator 2.4.0, but apparently, as Steve
>> > mentioned, sbt/ivy decided to break things, so I'll be adding some
>> > exclusions.
>> >
>>
>>
>> It's not that ivy is bad per se, only that it has a different policy, one
>> which holds *provided all later versions of JARs are backwards compatible*
>>
>> Maven's closest-first policy has a different flaw, namely that it's not
>> always obvious why a guava 14.0 that is two hops of transitiveness should
>> take priority over a 16.0 version three hops away. Especially when that
>> 14.0 version should have come
>>
>> If you look at the diffs for the hive 1.2.1 update patch, the final
>> week was pretty much frittered away trying to get the two builds to have
>> consistent versions of things.
>>
>> 1. I should have historical commit rights to ivy, so, transitively to
>> SBT's dependency logic. If someone writes a resolver with the same
>> behaviour as maven2 I'll see about getting it in.
>>
>> 2. Hadoop 2.6 is on curator 2.7.1; HADOOP-11492. To verify it worked
>> against guava 11.0.2, I ended up compiling curator against that version to
>> see what broke. curator-x-discovery is the only module which doesn't
>> compile against older guava versions (HADOOP-11102)
>>
>> -Steve
>>
>>
>>
>

Re: Master build fails ?

Posted by Koert Kuipers <ko...@tresata.com>.
If there is no strong preference for one dependency policy over the other,
but consistency between the two systems is desired, then I believe Maven can
be made to behave like ivy fairly easily with a setting in the pom.

On Fri, Nov 6, 2015 at 5:21 AM, Steve Loughran <st...@hortonworks.com>
wrote:

>
> > On 5 Nov 2015, at 20:07, Marcelo Vanzin <va...@cloudera.com> wrote:
> >
> > Man that command is slow. Anyway, it seems guava 16 is being brought
> > transitively by curator 2.6.0 which should have been overridden by the
> > explicit dependency on curator 2.4.0, but apparently, as Steve
> > mentioned, sbt/ivy decided to break things, so I'll be adding some
> > exclusions.
> >
>
>
> It's not that ivy is bad per se, only that it has a different policy, one
> which holds *provided all later versions of JARs are backwards compatible*
>
> Maven's closest-first policy has a different flaw, namely that it's not
> always obvious why a guava 14.0 that is two hops of transitiveness should
> take priority over a 16.0 version three hops away. Especially when that
> 14.0 version should have come
>
> If you look at the diffs for the hive 1.2.1 update patch, the final
> week was pretty much frittered away trying to get the two builds to have
> consistent versions of things.
>
> 1. I should have historical commit rights to ivy, so, transitively to
> SBT's dependency logic. If someone writes a resolver with the same
> behaviour as maven2 I'll see about getting it in.
>
> 2. Hadoop 2.6 is on curator 2.7.1; HADOOP-11492. To verify it worked
> against guava 11.0.2, I ended up compiling curator against that version to
> see what broke. curator-x-discovery is the only module which doesn't
> compile against older guava versions (HADOOP-11102)
>
> -Steve
>
>
>

Re: Master build fails ?

Posted by Steve Loughran <st...@hortonworks.com>.
> On 6 Nov 2015, at 17:35, Marcelo Vanzin <va...@cloudera.com> wrote:
> 
> On Fri, Nov 6, 2015 at 2:21 AM, Steve Loughran <st...@hortonworks.com> wrote:
>> Maven's closest-first policy has a different flaw, namely that it's not always obvious why a guava 14.0 that is two hops of transitiveness should take priority over a 16.0 version three hops away. Especially when that 14.0 version should have come
> 
> But that's not the case here; guava is a direct dependency of spark,
> not a transitive one, and the root pom explicitly sets its version to
> 14. sbt is just choosing to ignore that and pick whatever latest
> version exists from transitive analysis.

I agree, that's wrong
> 
> Maven would behave similarly if Spark did not declare a direct
> dependency on guava, but it does.
> 

I think if you have an indirect dependency, it picks the one on the shortest path, so if you aren't explicit, you can still lose control of what's going on...



Re: Master build fails ?

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Fri, Nov 6, 2015 at 2:21 AM, Steve Loughran <st...@hortonworks.com> wrote:
> Maven's closest-first policy has a different flaw, namely that it's not always obvious why a guava 14.0 that is two hops of transitiveness should take priority over a 16.0 version three hops away. Especially when that 14.0 version should have come

But that's not the case here; guava is a direct dependency of spark,
not a transitive one, and the root pom explicitly sets its version to
14. sbt is just choosing to ignore that and pick whatever latest
version exists from transitive analysis.

Maven would behave similarly if Spark did not declare a direct
dependency on guava, but it does.
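
On the sbt side, the pom's explicit pin can be mirrored with a version override, which sidesteps ivy's latest-wins conflict resolution for that one artifact. A minimal sketch, assuming sbt 0.13-era syntax (an illustration, not the change that actually went into Spark's build):

```scala
// Force the version chosen for guava wherever it appears in the transitive
// graph, matching the explicit guava 14.0.1 pin in the root pom. This
// overrides ivy's default "latest revision wins" conflict manager for
// this artifact only; it does not add a new dependency.
dependencyOverrides += "com.google.guava" % "guava" % "14.0.1"
```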

-- 
Marcelo



Re: Master build fails ?

Posted by Steve Loughran <st...@hortonworks.com>.
> On 5 Nov 2015, at 20:07, Marcelo Vanzin <va...@cloudera.com> wrote:
> 
> Man that command is slow. Anyway, it seems guava 16 is being brought
> transitively by curator 2.6.0 which should have been overridden by the
> explicit dependency on curator 2.4.0, but apparently, as Steve
> mentioned, sbt/ivy decided to break things, so I'll be adding some
> exclusions.
> 


It's not that ivy is bad per se, only that it has a different policy, one which holds *provided all later versions of JARs are backwards compatible*

Maven's closest-first policy has a different flaw, namely that it's not always obvious why a guava 14.0 that is two hops of transitiveness should take priority over a 16.0 version three hops away. Especially when that 14.0 version should have come
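
The difference between the two policies can be sketched with a toy model (hypothetical versions and hop counts, purely for illustration):

```scala
// Toy model of the two conflict-resolution policies: each candidate is
// (version, hops from the root of the dependency graph). Comparing versions
// as strings is good enough for this two-entry example.
val candidates = Seq(("14.0", 2), ("16.0", 3))

// Maven: the declaration nearest the root wins, so the two-hop 14.0 is picked.
val mavenPick = candidates.minBy(_._2)._1  // "14.0"

// Ivy/sbt default: the latest revision wins regardless of depth, so 16.0 is picked.
val ivyPick = candidates.maxBy(_._1)._1    // "16.0"
```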

If you look at the diffs for the hive 1.2.1 update patch, the final week was pretty much frittered away trying to get the two builds to have consistent versions of things.

1. I should have historical commit rights to ivy, so, transitively to SBT's dependency logic. If someone writes a resolver with the same behaviour as maven2 I'll see about getting it in.

2. Hadoop 2.6 is on curator 2.7.1; HADOOP-11492. To verify it worked against guava 11.0.2, I ended up compiling curator against that version to see what broke. curator-x-discovery is the only module which doesn't compile against older guava versions (HADOOP-11102)

-Steve



Re: Master build fails ?

Posted by Marcelo Vanzin <va...@cloudera.com>.
FYI, I pushed a fix for this to github, so if you pull, everything
should work now.

On Thu, Nov 5, 2015 at 12:07 PM, Marcelo Vanzin <va...@cloudera.com> wrote:
> Man that command is slow. Anyway, it seems guava 16 is being brought
> transitively by curator 2.6.0 which should have been overridden by the
> explicit dependency on curator 2.4.0, but apparently, as Steve
> mentioned, sbt/ivy decided to break things, so I'll be adding some
> exclusions.
>
> On Thu, Nov 5, 2015 at 11:55 AM, Marcelo Vanzin <va...@cloudera.com> wrote:
>> Answering my own question: "dependency-graph"
>>
>> On Thu, Nov 5, 2015 at 11:44 AM, Marcelo Vanzin <va...@cloudera.com> wrote:
>>> Does anyone know how to get something similar to "mvn dependency:tree" from sbt?
>>>
>>> mvn dependency:tree with hadoop 2.6.0 does not show any instances of guava 16...
>>>
>>> On Thu, Nov 5, 2015 at 11:37 AM, Ted Yu <yu...@gmail.com> wrote:
>>>> build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
>>>> -Dhadoop.version=2.6.0 -DskipTests assembly
>>>>
>>>> The above command fails on Mac.
>>>>
>>>> build/sbt -Pyarn -Phadoop-2.2 -Phive -Phive-thriftserver -Pkinesis-asl
>>>> -DskipTests assembly
>>>>
>>>> The above command, used by Jenkins, passes.
>>>> That's why the build error wasn't caught.
>>>>
>>>> FYI
>>>>
>>>> On Thu, Nov 5, 2015 at 11:07 AM, Dilip Biswal <db...@us.ibm.com> wrote:
>>>>>
>>>>> Hello Ted,
>>>>>
>>>>> Thanks for your response.
>>>>>
>>>>> Here is the command I used:
>>>>>
>>>>> build/sbt clean
>>>>> build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
>>>>> -Dhadoop.version=2.6.0 -DskipTests assembly
>>>>>
>>>>> I am building on CentOS, on the master branch.
>>>>>
>>>>> One other thing: I was able to build fine with the above command until
>>>>> recently. I think I started
>>>>> to have problems after SPARK-11073, where the HashCodes import was added.
>>>>>
>>>>> Regards,
>>>>> Dilip Biswal
>>>>> Tel: 408-463-4980
>>>>> dbiswal@us.ibm.com
>>>>>
>>>>>
>>>>>
>>>>> From:        Ted Yu <yu...@gmail.com>
>>>>> To:        Dilip Biswal/Oakland/IBM@IBMUS
>>>>> Cc:        Jean-Baptiste Onofré <jb...@nanthrax.net>, "dev@spark.apache.org"
>>>>> <de...@spark.apache.org>
>>>>> Date:        11/05/2015 10:46 AM
>>>>> Subject:        Re: Master build fails ?
>>>>> ________________________________
>>>>>
>>>>>
>>>>>
>>>>> Dilip:
>>>>> Can you give the command you used ?
>>>>>
>>>>> Which release were you building ?
>>>>> What OS did you build on ?
>>>>>
>>>>> Cheers
>>>>>
>>>>> On Thu, Nov 5, 2015 at 10:21 AM, Dilip Biswal <db...@us.ibm.com> wrote:
>>>>> Hello,
>>>>>
>>>>> I am getting the same build error about not being able to find
>>>>> com.google.common.hash.HashCodes.
>>>>>
>>>>>
>>>>> Is there a solution to this ?
>>>>>
>>>>> Regards,
>>>>> Dilip Biswal
>>>>> Tel: 408-463-4980
>>>>> dbiswal@us.ibm.com
>>>>>
>>>>>
>>>>>
>>>>> From:        Jean-Baptiste Onofré <jb...@nanthrax.net>
>>>>> To:        Ted Yu <yu...@gmail.com>
>>>>> Cc:        "dev@spark.apache.org" <de...@spark.apache.org>
>>>>> Date:        11/03/2015 07:20 AM
>>>>> Subject:        Re: Master build fails ?
>>>>> ________________________________
>>>>>
>>>>>
>>>>>
>>>>> Hi Ted,
>>>>>
>>>>> thanks for the update. The build with sbt is in progress on my box.
>>>>>
>>>>> Regards
>>>>> JB
>>>>>
>>>>> On 11/03/2015 03:31 PM, Ted Yu wrote:
>>>>> > Interesting, Sbt builds were not all failing:
>>>>> >
>>>>> > https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/
>>>>> >
>>>>> > FYI
>>>>> >
>>>>> > On Tue, Nov 3, 2015 at 5:58 AM, Jean-Baptiste Onofré <jb@nanthrax.net
>>>>> > <ma...@nanthrax.net>> wrote:
>>>>>
>>>>> >
>>>>> >     Hi Jacek,
>>>>> >
>>>>> >     it works fine with mvn: the problem is with sbt.
>>>>> >
>>>>> >     I suspect a different reactor order in sbt compared to mvn.
>>>>> >
>>>>> >     Regards
>>>>> >     JB
>>>>> >
>>>>> >     On 11/03/2015 02:44 PM, Jacek Laskowski wrote:
>>>>> >
>>>>> >         Hi,
>>>>> >
>>>>> >         Just built the sources using the following command and it worked
>>>>> >         fine.
>>>>> >
>>>>> >         ➜  spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6
>>>>> >         -Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver
>>>>> >         -DskipTests clean install
>>>>> >         ...
>>>>> >         [INFO]
>>>>> >
>>>>> > ------------------------------------------------------------------------
>>>>> >         [INFO] BUILD SUCCESS
>>>>> >         [INFO]
>>>>> >
>>>>> > ------------------------------------------------------------------------
>>>>> >         [INFO] Total time: 14:15 min
>>>>> >         [INFO] Finished at: 2015-11-03T14:40:40+01:00
>>>>> >         [INFO] Final Memory: 438M/1972M
>>>>> >         [INFO]
>>>>> >
>>>>> > ------------------------------------------------------------------------
>>>>> >
>>>>> >         ➜  spark git:(master) ✗ java -version
>>>>> >         java version "1.8.0_66"
>>>>> >         Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
>>>>> >         Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)
>>>>> >
>>>>> >         I'm on Mac OS.
>>>>> >
>>>>> >         Pozdrawiam,
>>>>> >         Jacek
>>>>> >
>>>>> >         --
>>>>> >         Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
>>>>>
>>>>> >         Follow me at https://twitter.com/jaceklaskowski
>>>>> >         Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski
>>>>>
>>>>> >
>>>>> >
>>>>> >         On Tue, Nov 3, 2015 at 1:37 PM, Jean-Baptiste Onofré
>>>>> >         <jb...@nanthrax.net>> wrote:
>>>>> >
>>>>> >             Thanks for the update, I used mvn to build but without hive
>>>>> >             profile.
>>>>> >
>>>>> >             Let me try with mvn with the same options as you and sbt
>>>>> > also.
>>>>> >
>>>>> >             I keep you posted.
>>>>> >
>>>>> >             Regards
>>>>> >             JB
>>>>> >
>>>>> >             On 11/03/2015 12:55 PM, Jeff Zhang wrote:
>>>>> >
>>>>> >
>>>>> >                 I found it is due to SPARK-11073.
>>>>> >
>>>>> >                 Here's the command I used to build
>>>>> >
>>>>> >                 build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive
>>>>> >                 -Phive-thriftserver
>>>>> >                 -Psparkr
>>>>> >
>>>>> >                 On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré
>>>>> >                 <jb...@nanthrax.net>
>>>>> >                 <ma...@nanthrax.net>>> wrote:
>>>>>
>>>>> >
>>>>> >                       Hi Jeff,
>>>>> >
>>>>> >                       it works for me (with skipping the tests).
>>>>> >
>>>>> >                       Let me try again, just to be sure.
>>>>> >
>>>>> >                       Regards
>>>>> >                       JB
>>>>> >
>>>>> >
>>>>> >                       On 11/03/2015 11:50 AM, Jeff Zhang wrote:
>>>>> >
>>>>> >                           Looks like it's due to guava version
>>>>> >                 conflicts, I see both guava
>>>>> >                           14.0.1
>>>>> >                           and 16.0.1 under lib_managed/bundles. Anyone
>>>>> >                 meet this issue too ?
>>>>> >
>>>>> >                           [error]
>>>>> >
>>>>> >
>>>>> > /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
>>>>> >                           object HashCodes is not a member of package
>>>>> >                 com.google.common.hash
>>>>> >                           [error] import
>>>>> > com.google.common.hash.HashCodes
>>>>> >                           [error]        ^
>>>>> >                           [info] Resolving
>>>>> >                 org.apache.commons#commons-math;2.2 ...
>>>>> >                           [error]
>>>>> >
>>>>> >
>>>>> > /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
>>>>> >                           not found: value HashCodes
>>>>> >                           [error]         val cookie =
>>>>> >                 HashCodes.fromBytes(secret).toString()
>>>>> >                           [error]                      ^
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> >                           --
>>>>> >                           Best Regards
>>>>> >
>>>>> >                           Jeff Zhang
>>>>> >
>>>>> >
>>>>> >                       --
>>>>> >                       Jean-Baptiste Onofré
>>>>> >                 jbonofre@apache.org<
>>>>> mailto:jbonofre@apache.org>
>>>>> >                 <ma...@apache.org>>
>>>>> >                 http://blog.nanthrax.net
>>>>> >                       Talend - http://www.talend.com
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> >                 --
>>>>> >                 Best Regards
>>>>> >
>>>>> >                 Jeff Zhang
>>>>> >
>>>>> >
>>>>> >
>>>>> >             --
>>>>> >             Jean-Baptiste Onofré
>>>>> >             jbonofre@apache.org<ma...@apache.org>
>>>>> >             http://blog.nanthrax.net
>>>>> >             Talend - http://www.talend.com
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>> >     --
>>>>> >     Jean-Baptiste Onofré
>>>>> >     jbonofre@apache.org<ma...@apache.org>
>>>>> >     http://blog.nanthrax.net
>>>>> >     Talend - http://www.talend.com
>>>>> >
>>>>> >
>>>>> >
>>>>> >
>>>>>
>>>>> --
>>>>> Jean-Baptiste Onofré
>>>>> jbonofre@apache.org
>>>>> http://blog.nanthrax.net
>>>>> Talend - http://www.talend.com
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>>
>>> --
>>> Marcelo
>>
>>
>>
>> --
>> Marcelo
>
>
>
> --
> Marcelo



-- 
Marcelo



Re: Master build fails ?

Posted by Marcelo Vanzin <va...@cloudera.com>.
Man that command is slow. Anyway, it seems guava 16 is being brought
transitively by curator 2.6.0 which should have been overridden by the
explicit dependency on curator 2.4.0, but apparently, as Steve
mentioned, sbt/ivy decided to break things, so I'll be adding some
exclusions.
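
An exclusion of the kind mentioned above looks roughly like this on the sbt side (a sketch only; the real fix landed in Spark's poms, and the curator module shown here is illustrative):

```scala
// Keep curator from dragging guava 16 in transitively, so the project's own
// direct guava dependency stays the only candidate at resolution time.
libraryDependencies += ("org.apache.curator" % "curator-recipes" % "2.4.0")
  .exclude("com.google.guava", "guava")
```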

On Thu, Nov 5, 2015 at 11:55 AM, Marcelo Vanzin <va...@cloudera.com> wrote:
> Answering my own question: "dependency-graph"
>
> On Thu, Nov 5, 2015 at 11:44 AM, Marcelo Vanzin <va...@cloudera.com> wrote:
>> Does anyone know how to get something similar to "mvn dependency:tree" from sbt?
>>
>> mvn dependency:tree with hadoop 2.6.0 does not show any instances of guava 16...
>>
>> On Thu, Nov 5, 2015 at 11:37 AM, Ted Yu <yu...@gmail.com> wrote:
>>> build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
>>> -Dhadoop.version=2.6.0 -DskipTests assembly
>>>
>>> The above command fails on Mac.
>>>
>>> build/sbt -Pyarn -Phadoop-2.2 -Phive -Phive-thriftserver -Pkinesis-asl
>>> -DskipTests assembly
>>>
>>> The above command, used by Jenkins, passes.
>>> That's why the build error wasn't caught.
>>>
>>> FYI
>>>
>>> On Thu, Nov 5, 2015 at 11:07 AM, Dilip Biswal <db...@us.ibm.com> wrote:
>>>>
>>>> Hello Ted,
>>>>
>>>> Thanks for your response.
>>>>
>>>> Here is the command I used:
>>>>
>>>> build/sbt clean
>>>> build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
>>>> -Dhadoop.version=2.6.0 -DskipTests assembly
>>>>
>>>> I am building on CentOS, on the master branch.
>>>>
>>>> One other thing: I was able to build fine with the above command until
>>>> recently. I think I started
>>>> to have problems after SPARK-11073, where the HashCodes import was added.
>>>>
>>>> Regards,
>>>> Dilip Biswal
>>>> Tel: 408-463-4980
>>>> dbiswal@us.ibm.com
>>>>
>>>>
>>>>
>>>> From:        Ted Yu <yu...@gmail.com>
>>>> To:        Dilip Biswal/Oakland/IBM@IBMUS
>>>> Cc:        Jean-Baptiste Onofré <jb...@nanthrax.net>, "dev@spark.apache.org"
>>>> <de...@spark.apache.org>
>>>> Date:        11/05/2015 10:46 AM
>>>> Subject:        Re: Master build fails ?
>>>> ________________________________
>>>>
>>>>
>>>>
>>>> Dilip:
>>>> Can you give the command you used ?
>>>>
>>>> Which release were you building ?
>>>> What OS did you build on ?
>>>>
>>>> Cheers
>>>>
>>>> On Thu, Nov 5, 2015 at 10:21 AM, Dilip Biswal <db...@us.ibm.com> wrote:
>>>> Hello,
>>>>
>>>> I am getting the same build error about not being able to find
>>>> com.google.common.hash.HashCodes.
>>>>
>>>>
>>>> Is there a solution to this ?
>>>>
>>>> Regards,
>>>> Dilip Biswal
>>>> Tel: 408-463-4980
>>>> dbiswal@us.ibm.com
>>>>
>>>>
>>>>
>>>> From:        Jean-Baptiste Onofré <jb...@nanthrax.net>
>>>> To:        Ted Yu <yu...@gmail.com>
>>>> Cc:        "dev@spark.apache.org" <de...@spark.apache.org>
>>>> Date:        11/03/2015 07:20 AM
>>>> Subject:        Re: Master build fails ?
>>>> ________________________________
>>>>
>>>>
>>>>
>>>> Hi Ted,
>>>>
>>>> thanks for the update. The build with sbt is in progress on my box.
>>>>
>>>> Regards
>>>> JB
>>>>
>>>> On 11/03/2015 03:31 PM, Ted Yu wrote:
>>>> > Interesting, Sbt builds were not all failing:
>>>> >
>>>> > https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/
>>>> >
>>>> > FYI
>>>> >
>>>> > On Tue, Nov 3, 2015 at 5:58 AM, Jean-Baptiste Onofré <jb@nanthrax.net
>>>> > <ma...@nanthrax.net>> wrote:
>>>>
>>>> >
>>>> >     Hi Jacek,
>>>> >
>>>> >     it works fine with mvn: the problem is with sbt.
>>>> >
>>>> >     I suspect a different reactor order in sbt compared to mvn.
>>>> >
>>>> >     Regards
>>>> >     JB
>>>> >
>>>> >     On 11/03/2015 02:44 PM, Jacek Laskowski wrote:
>>>> >
>>>> >         Hi,
>>>> >
>>>> >         Just built the sources using the following command and it worked
>>>> >         fine.
>>>> >
>>>> >         ➜  spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6
>>>> >         -Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver
>>>> >         -DskipTests clean install
>>>> >         ...
>>>> >         [INFO]
>>>> >
>>>> > ------------------------------------------------------------------------
>>>> >         [INFO] BUILD SUCCESS
>>>> >         [INFO]
>>>> >
>>>> > ------------------------------------------------------------------------
>>>> >         [INFO] Total time: 14:15 min
>>>> >         [INFO] Finished at: 2015-11-03T14:40:40+01:00
>>>> >         [INFO] Final Memory: 438M/1972M
>>>> >         [INFO]
>>>> >
>>>> > ------------------------------------------------------------------------
>>>> >
>>>> >         ➜  spark git:(master) ✗ java -version
>>>> >         java version "1.8.0_66"
>>>> >         Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
>>>> >         Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)
>>>> >
>>>> >         I'm on Mac OS.
>>>> >
>>>> >         Pozdrawiam,
>>>> >         Jacek
>>>> >
>>>> >         --
>>>> >         Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
>>>>
>>>> >         Follow me at https://twitter.com/jaceklaskowski
>>>> >         Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski
>>>>
>>>> >
>>>> >
>>>> >         On Tue, Nov 3, 2015 at 1:37 PM, Jean-Baptiste Onofré
>>>> >         <jb...@nanthrax.net>> wrote:
>>>> >
>>>> >             Thanks for the update, I used mvn to build but without hive
>>>> >             profile.
>>>> >
>>>> >             Let me try with mvn with the same options as you and sbt
>>>> > also.
>>>> >
>>>> >             I keep you posted.
>>>> >
>>>> >             Regards
>>>> >             JB
>>>> >
>>>> >             On 11/03/2015 12:55 PM, Jeff Zhang wrote:
>>>> >
>>>> >
>>>> >                 I found it is due to SPARK-11073.
>>>> >
>>>> >                 Here's the command I used to build
>>>> >
>>>> >                 build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive
>>>> >                 -Phive-thriftserver
>>>> >                 -Psparkr
>>>> >
>>>> >                 On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré
>>>> >                 <jb...@nanthrax.net>
>>>> >                 <ma...@nanthrax.net>>> wrote:
>>>>
>>>> >
>>>> >                       Hi Jeff,
>>>> >
>>>> >                       it works for me (with skipping the tests).
>>>> >
>>>> >                       Let me try again, just to be sure.
>>>> >
>>>> >                       Regards
>>>> >                       JB
>>>> >
>>>> >
>>>> >                       On 11/03/2015 11:50 AM, Jeff Zhang wrote:
>>>> >
>>>> >                           Looks like it's due to guava version
>>>> >                 conflicts, I see both guava
>>>> >                           14.0.1
>>>> >                           and 16.0.1 under lib_managed/bundles. Anyone
>>>> >                 meet this issue too ?
>>>> >
>>>> >                           [error]
>>>> >
>>>> >
>>>> > /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
>>>> >                           object HashCodes is not a member of package
>>>> >                 com.google.common.hash
>>>> >                           [error] import
>>>> > com.google.common.hash.HashCodes
>>>> >                           [error]        ^
>>>> >                           [info] Resolving
>>>> >                 org.apache.commons#commons-math;2.2 ...
>>>> >                           [error]
>>>> >
>>>> >
>>>> > /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
>>>> >                           not found: value HashCodes
>>>> >                           [error]         val cookie =
>>>> >                 HashCodes.fromBytes(secret).toString()
>>>> >                           [error]                      ^
>>>> >
>>>> >
>>>> >
>>>> >
>>>> >                           --
>>>> >                           Best Regards
>>>> >
>>>> >                           Jeff Zhang
>>>> >
>>>> >
>>>> >                       --
>>>> >                       Jean-Baptiste Onofré
>>>> >                 jbonofre@apache.org<
>>>> mailto:jbonofre@apache.org>
>>>> >                 <ma...@apache.org>>
>>>> >                 http://blog.nanthrax.net
>>>> >                       Talend - http://www.talend.com
>>>> >
>>>> >
>>>> >
>>>> > ---------------------------------------------------------------------
>>>> >                       To unsubscribe, e-mail:
>>>> >                 dev-unsubscribe@spark.apache.org
>>>> >                 <ma...@spark.apache.org>
>>>> >                       <mailto:dev-unsubscribe@spark.apache.org
>>>> >                 <ma...@spark.apache.org>>
>>>> >                       For additional commands, e-mail:
>>>> >
>>>> > dev-help@spark.apache.org<ma...@spark.apache.org>
>>>> >                       <mailto:dev-help@spark.apache.org
>>>>
>>>> >                 <ma...@spark.apache.org>>
>>>> >
>>>> >
>>>> >
>>>> >
>>>> >                 --
>>>> >                 Best Regards
>>>> >
>>>> >                 Jeff Zhang
>>>> >
>>>> >
>>>> >
>>>> >             --
>>>> >             Jean-Baptiste Onofré
>>>> >             jbonofre@apache.org<ma...@apache.org>
>>>> >             http://blog.nanthrax.net
>>>> >             Talend - http://www.talend.com
>>>> >
>>>> >
>>>> > ---------------------------------------------------------------------
>>>> >             To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>> >             <ma...@spark.apache.org>
>>>> >             For additional commands, e-mail: dev-help@spark.apache.org
>>>> >             <ma...@spark.apache.org>
>>>> >
>>>> >
>>>> >
>>>> > ---------------------------------------------------------------------
>>>> >         To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>> >         <ma...@spark.apache.org>
>>>> >         For additional commands, e-mail: dev-help@spark.apache.org
>>>> >         <ma...@spark.apache.org>
>>>> >
>>>> >
>>>> >     --
>>>> >     Jean-Baptiste Onofré
>>>> >     jbonofre@apache.org<ma...@apache.org>
>>>> >     http://blog.nanthrax.net
>>>> >     Talend - http://www.talend.com
>>>> >
>>>> >
>>>> > ---------------------------------------------------------------------
>>>> >     To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>> >     <ma...@spark.apache.org>
>>>> >     For additional commands, e-mail: dev-help@spark.apache.org
>>>> >     <ma...@spark.apache.org>
>>>> >
>>>> >
>>>>
>>>> --
>>>> Jean-Baptiste Onofré
>>>> jbonofre@apache.org
>>>> http://blog.nanthrax.net
>>>> Talend - http://www.talend.com
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>
>>
>>
>> --
>> Marcelo
>
>
>
> --
> Marcelo



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: Master build fails ?

Posted by Marcelo Vanzin <va...@cloudera.com>.
Answering my own question: "dependency-graph"
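For readers hitting the same question, here is a sketch of how that task could be combined with grep to surface the conflict. The real sbt invocation (kept as a comment) assumes a Spark checkout with the sbt-dependency-graph plugin on the build's plugin path, and the tree excerpt below is hypothetical, not real Spark output:

```shell
# Hedged sketch: dump an sbt analogue of `mvn dependency:tree` for one module.
# This invocation needs a real Spark checkout, so it stays in a comment:
#
#   build/sbt -Pyarn -Phadoop-2.6 "core/dependency-graph" > /tmp/sbt-deps.txt
#
# Write a hypothetical excerpt of such a tree, then grep it for guava:
cat > /tmp/sbt-deps.txt <<'EOF'
[info] org.apache.spark:spark-core_2.10:1.6.0-SNAPSHOT
[info]   +-com.google.guava:guava:14.0.1
[info]   +-org.tachyonproject:tachyon-client:0.8.1 (hypothetical culprit)
[info]     +-com.google.guava:guava:16.0.1
EOF
grep -c 'com.google.guava:guava' /tmp/sbt-deps.txt   # prints 2
```

A count above one for a single module is the same symptom as the two guava jars Jeff reported under lib_managed/bundles.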

On Thu, Nov 5, 2015 at 11:44 AM, Marcelo Vanzin <va...@cloudera.com> wrote:
> Does anyone know how to get something similar to "mvn dependency:tree" from sbt?
>
> mvn dependency:tree with hadoop 2.6.0 does not show any instances of guava 16...



-- 
Marcelo



Re: Master build fails ?

Posted by Marcelo Vanzin <va...@cloudera.com>.
Does anyone know how to get something similar to "mvn dependency:tree" from sbt?

mvn dependency:tree with hadoop 2.6.0 does not show any instances of guava 16...
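One hedged way to double-check that on the mvn side is the dependency plugin's filtering and verbose modes. The flags below are standard maven-dependency-plugin options, but the output excerpt is hypothetical:

```shell
# Hedged sketch: restrict the tree to guava and keep conflict information.
# The real invocation needs a Spark checkout, so it stays in a comment:
#
#   build/mvn -Pyarn -Phadoop-2.6 dependency:tree -Dverbose -Dincludes=com.google.guava
#
# With -Dverbose, versions dropped by Maven's nearest-wins resolution are still
# printed and flagged as omitted. Filtering a hypothetical verbose excerpt:
cat > /tmp/mvn-guava.txt <<'EOF'
[INFO] org.apache.spark:spark-core_2.10:jar:1.6.0-SNAPSHOT
[INFO] \- com.google.guava:guava:jar:14.0.1:provided
[INFO]    \- (com.google.guava:guava:jar:16.0.1:compile - omitted for conflict with 14.0.1)
EOF
grep -c 'omitted for conflict' /tmp/mvn-guava.txt   # prints 1
```

If verbose output showed such an omitted entry, it could explain why a plain dependency:tree shows no guava 16: Maven resolves the conflict to a single version, while sbt's lib_managed directory ends up holding both jars.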

On Thu, Nov 5, 2015 at 11:37 AM, Ted Yu <yu...@gmail.com> wrote:
> build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
> -Dhadoop.version=2.6.0 -DskipTests assembly
>
> The above command fails on Mac.
>
> build/sbt -Pyarn -Phadoop-2.2 -Phive -Phive-thriftserver -Pkinesis-asl
> -DskipTests assembly
>
> The above command, used by Jenkins, passes.
> That's why the build error wasn't caught.
>
> FYI



-- 
Marcelo



Re: Master build fails ?

Posted by Ted Yu <yu...@gmail.com>.
build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
-Dhadoop.version=2.6.0 -DskipTests assembly

The above command fails on Mac.

build/sbt -Pyarn -Phadoop-2.2 -Phive -Phive-thriftserver -Pkinesis-asl
-DskipTests assembly

The above command, used by Jenkins, passes.
That's why the build error wasn't caught.

FYI
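Since only the profile set differs between the failing and passing commands, one way to localize the culprit is to diff the dependency lists the two profile sets resolve. The sbt runs themselves need a Spark checkout, so the sketch below diffs hypothetical saved output:

```shell
# Hedged sketch: save one dependency list per profile set (real runs need a
# Spark checkout; the file contents below are hypothetical stand-ins):
#
#   build/sbt -Pyarn -Phadoop-2.2 -Phive -Phive-thriftserver -Pkinesis-asl \
#     "core/dependency-graph" > /tmp/h22.txt
#   build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver \
#     -Dhadoop.version=2.6.0 "core/dependency-graph" > /tmp/h26.txt
#
printf '%s\n' 'com.google.guava:guava:14.0.1' > /tmp/h22.txt
printf '%s\n' 'com.google.guava:guava:14.0.1' 'com.google.guava:guava:16.0.1' > /tmp/h26.txt
# comm needs sorted input; -13 keeps only lines unique to the second file,
# i.e. whatever the failing -Phadoop-2.6 build pulls in on top:
sort -o /tmp/h22.txt /tmp/h22.txt
sort -o /tmp/h26.txt /tmp/h26.txt
comm -13 /tmp/h22.txt /tmp/h26.txt   # prints the guava 16.0.1 line
```

Whatever lands only in the failing profile's list is the dependency to chase.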

On Thu, Nov 5, 2015 at 11:07 AM, Dilip Biswal <db...@us.ibm.com> wrote:

> Hello Ted,
>
> Thanks for your response.
>
> Here is the command I used:
>
> build/sbt clean
> build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
> -Dhadoop.version=2.6.0 -DskipTests assembly
>
> I am building on CentOS, on the master branch.
>
> One other thing: I was able to build fine with the above command until
> recently. I think I started
> to have problems after SPARK-11073, where the HashCodes import was added.
>
> Regards,
> Dilip Biswal
> Tel: 408-463-4980
> dbiswal@us.ibm.com
> >             <*mailto:dev-help@spark.apache.org*
> <de...@spark.apache.org>>
> >
> >
> >
> ---------------------------------------------------------------------
> >         To unsubscribe, e-mail: *dev-unsubscribe@spark.apache.org*
> <de...@spark.apache.org>
> >         <*mailto:dev-unsubscribe@spark.apache.org*
> <de...@spark.apache.org>>
> >         For additional commands, e-mail: *dev-help@spark.apache.org*
> <de...@spark.apache.org>
> >         <*mailto:dev-help@spark.apache.org* <de...@spark.apache.org>>
> >
> >
> >     --
> >     Jean-Baptiste Onofré
> >     *jbonofre@apache.org* <jb...@apache.org><
> *mailto:jbonofre@apache.org* <jb...@apache.org>>
> >     *http://blog.nanthrax.net* <http://blog.nanthrax.net/>
> >     Talend - *http://www.talend.com* <http://www.talend.com/>
> >
> >     ---------------------------------------------------------------------
> >     To unsubscribe, e-mail: *dev-unsubscribe@spark.apache.org*
> <de...@spark.apache.org>
> >     <*mailto:dev-unsubscribe@spark.apache.org*
> <de...@spark.apache.org>>
> >     For additional commands, e-mail: *dev-help@spark.apache.org*
> <de...@spark.apache.org>
> >     <*mailto:dev-help@spark.apache.org* <de...@spark.apache.org>>
> >
> >
>
> --
> Jean-Baptiste Onofré
> *jbonofre@apache.org* <jb...@apache.org>
> *http://blog.nanthrax.net* <http://blog.nanthrax.net/>
> Talend - *http://www.talend.com* <http://www.talend.com/>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: *dev-unsubscribe@spark.apache.org*
> <de...@spark.apache.org>
> For additional commands, e-mail: *dev-help@spark.apache.org*
> <de...@spark.apache.org>
>
>
>
>
>
>

Re: Master build fails ?

Posted by Dilip Biswal <db...@us.ibm.com>.
Hello Ted,

Thanks for your response.

Here is the command I used:

build/sbt clean
build/sbt -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver 
-Dhadoop.version=2.6.0 -DskipTests assembly

I am building on CentOS and on master branch.

One other thing: I was able to build fine with the above command up until
recently. I think I started having problems after SPARK-11073, where the
HashCodes import was added.
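
For anyone hitting the same error: the symptom reported above is two guava jars being resolved side by side. The sketch below is a self-contained demo of a check for that (the lib_managed/bundles layout and the 14.0.1/16.0.1 versions come from the report quoted in this thread; the temporary directory is a hypothetical stand-in for a real Spark checkout):

```shell
# Demo: detect conflicting guava versions under sbt's lib_managed directory.
# The temp dir stands in for a Spark checkout; in a real tree you would run
# the find/sort pipeline below from the source root instead.
workdir=$(mktemp -d)
mkdir -p "$workdir/lib_managed/bundles"
touch "$workdir/lib_managed/bundles/guava-14.0.1.jar"   # versions reported above
touch "$workdir/lib_managed/bundles/guava-16.0.1.jar"

# The actual check: list the distinct guava jars that sbt pulled in.
versions=$(find "$workdir/lib_managed" -name 'guava-*.jar' -exec basename {} \; | sort -u)
count=$(printf '%s\n' "$versions" | wc -l | tr -d ' ')

printf '%s\n' "$versions"
if [ "$count" -gt 1 ]; then
    echo "CONFLICT: $count guava versions on the sbt classpath"
fi
```

If the check reports more than one version, clearing lib_managed (and, if needed, the guava entries in ~/.ivy2/cache) before rebuilding is one plausible way to get back to a single resolved version; per the rest of this thread, the import itself came in with SPARK-11073 and the mvn build was not affected.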

Regards,
Dilip Biswal
Tel: 408-463-4980
dbiswal@us.ibm.com



From:   Ted Yu <yu...@gmail.com>
To:     Dilip Biswal/Oakland/IBM@IBMUS
Cc:     Jean-Baptiste Onofré <jb...@nanthrax.net>, "dev@spark.apache.org" 
<de...@spark.apache.org>
Date:   11/05/2015 10:46 AM
Subject:        Re: Master build fails ?



Dilip:
Can you give the command you used ?

Which release were you building ?
What OS did you build on ?

Cheers

On Thu, Nov 5, 2015 at 10:21 AM, Dilip Biswal <db...@us.ibm.com> wrote:
Hello,

I am getting the same build error about not being able to find 
com.google.common.hash.HashCodes.

Is there a solution to this ?

Regards,
Dilip Biswal
Tel: 408-463-4980
dbiswal@us.ibm.com



From:        Jean-Baptiste Onofré <jb...@nanthrax.net>
To:        Ted Yu <yu...@gmail.com>
Cc:        "dev@spark.apache.org" <de...@spark.apache.org>
Date:        11/03/2015 07:20 AM
Subject:        Re: Master build fails ?



Hi Ted,

thanks for the update. The build with sbt is in progress on my box.

Regards
JB

On 11/03/2015 03:31 PM, Ted Yu wrote:
> Interesting, Sbt builds were not all failing:
>
> https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/
>
> FYI
>
> On Tue, Nov 3, 2015 at 5:58 AM, Jean-Baptiste Onofré <jb@nanthrax.net
> <ma...@nanthrax.net>> wrote:

>
>     Hi Jacek,
>
>     it works fine with mvn: the problem is with sbt.
>
>     I suspect a different reactor order in sbt compare to mvn.
>
>     Regards
>     JB
>
>     On 11/03/2015 02:44 PM, Jacek Laskowski wrote:
>
>         Hi,
>
>         Just built the sources using the following command and it worked
>         fine.
>
>         ➜  spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6
>         -Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver
>         -DskipTests clean install
>         ...
>         [INFO]
>         ------------------------------------------------------------------------
>         [INFO] BUILD SUCCESS
>         [INFO]
>         ------------------------------------------------------------------------
>         [INFO] Total time: 14:15 min
>         [INFO] Finished at: 2015-11-03T14:40:40+01:00
>         [INFO] Final Memory: 438M/1972M
>         [INFO]
>         ------------------------------------------------------------------------
>
>         ➜  spark git:(master) ✗ java -version
>         java version "1.8.0_66"
>         Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
>         Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)
>
>         I'm on Mac OS.
>
>         Pozdrawiam,
>         Jacek
>
>         --
>         Jacek Laskowski | http://blog.japila.pl |
>         http://blog.jaceklaskowski.pl

>         Follow me at https://twitter.com/jaceklaskowski
>         Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski

>
>
>         On Tue, Nov 3, 2015 at 1:37 PM, Jean-Baptiste Onofré
>         <jb@nanthrax.net <ma...@nanthrax.net>> wrote:
>
>             Thanks for the update, I used mvn to build but without hive
>             profile.
>
>             Let me try with mvn with the same options as you and sbt also.
>
>             I keep you posted.
>
>             Regards
>             JB
>
>             On 11/03/2015 12:55 PM, Jeff Zhang wrote:
>
>
>                 I found it is due to SPARK-11073.
>
>                 Here's the command I used to build
>
>                 build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive
>                 -Phive-thriftserver
>                 -Psparkr
>
>                 On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré
>                 <jb@nanthrax.net <ma...@nanthrax.net>
>                 <ma...@nanthrax.net>>> wrote:

>
>                       Hi Jeff,
>
>                       it works for me (with skipping the tests).
>
>                       Let me try again, just to be sure.
>
>                       Regards
>                       JB
>
>
>                       On 11/03/2015 11:50 AM, Jeff Zhang wrote:
>
>                           Looks like it's due to guava version
>                 conflicts, I see both guava
>                           14.0.1
>                           and 16.0.1 under lib_managed/bundles. Anyone
>                 meet this issue too ?
>
>                           [error]
>
>                 /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
>                           object HashCodes is not a member of package
>                 com.google.common.hash
>                           [error] import com.google.common.hash.HashCodes
>                           [error]        ^
>                           [info] Resolving
>                 org.apache.commons#commons-math;2.2 ...
>                           [error]
>
>                 /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
>                           not found: value HashCodes
>                           [error]         val cookie =
>                 HashCodes.fromBytes(secret).toString()
>                           [error]                      ^
>
>
>
>
>                           --
>                           Best Regards
>
>                           Jeff Zhang
>
>
>                       --
>                       Jean-Baptiste Onofré
>                 jbonofre@apache.org <mailto:jbonofre@apache.org>
>                 <ma...@apache.org>>
>                 http://blog.nanthrax.net
>                       Talend - http://www.talend.com
>
>
>                   ---------------------------------------------------------------------
>                       To unsubscribe, e-mail:
>                 dev-unsubscribe@spark.apache.org
>                 <ma...@spark.apache.org>
>                       <mailto:dev-unsubscribe@spark.apache.org
>                 <ma...@spark.apache.org>>
>                       For additional commands, e-mail:
>                 dev-help@spark.apache.org <mailto:dev-help@spark.apache.org>
>                       <mailto:dev-help@spark.apache.org

>                 <ma...@spark.apache.org>>
>
>
>
>
>                 --
>                 Best Regards
>
>                 Jeff Zhang
>
>
>
>             --
>             Jean-Baptiste Onofré
>             jbonofre@apache.org <ma...@apache.org>
>             http://blog.nanthrax.net
>             Talend - http://www.talend.com
>
>             ---------------------------------------------------------------------
>             To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>             <ma...@spark.apache.org>
>             For additional commands, e-mail: dev-help@spark.apache.org
>             <ma...@spark.apache.org>
>
>
>         ---------------------------------------------------------------------
>         To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>         <ma...@spark.apache.org>
>         For additional commands, e-mail: dev-help@spark.apache.org
>         <ma...@spark.apache.org>
>
>
>     --
>     Jean-Baptiste Onofré
>     jbonofre@apache.org <ma...@apache.org>
>     http://blog.nanthrax.net
>     Talend - http://www.talend.com
>
>     ---------------------------------------------------------------------
>     To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>     <ma...@spark.apache.org>
>     For additional commands, e-mail: dev-help@spark.apache.org
>     <ma...@spark.apache.org>
>
>

-- 
Jean-Baptiste Onofré
jbonofre@apache.org
http://blog.nanthrax.net
Talend - http://www.talend.com

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org







Re: Master build fails ?

Posted by Ted Yu <yu...@gmail.com>.
Dilip:
Can you give the command you used ?

Which release were you building ?
What OS did you build on ?

Cheers

On Thu, Nov 5, 2015 at 10:21 AM, Dilip Biswal <db...@us.ibm.com> wrote:

> Hello,
>
> I am getting the same build error about not being able to find
> com.google.common.hash.HashCodes.
>
> Is there a solution to this ?
>
> Regards,
> Dilip Biswal
> Tel: 408-463-4980
> dbiswal@us.ibm.com

Re: Master build fails ?

Posted by Dilip Biswal <db...@us.ibm.com>.
Hello,

I am getting the same build error about not being able to find 
com.google.common.hash.HashCodes.

Is there a solution to this ?

Regards,
Dilip Biswal
Tel: 408-463-4980
dbiswal@us.ibm.com



From:   Jean-Baptiste Onofré <jb...@nanthrax.net>
To:     Ted Yu <yu...@gmail.com>
Cc:     "dev@spark.apache.org" <de...@spark.apache.org>
Date:   11/03/2015 07:20 AM
Subject:        Re: Master build fails ?



Hi Ted,

thanks for the update. The build with sbt is in progress on my box.

Regards
JB






Re: Master build fails ?

Posted by Jean-Baptiste Onofré <jb...@nanthrax.net>.
Hi Ted,

thanks for the update. The build with sbt is in progress on my box.

Regards
JB

On 11/03/2015 03:31 PM, Ted Yu wrote:
> Interesting, Sbt builds were not all failing:
>
> https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/
>
> FYI
>     <ma...@spark.apache.org>
>     For additional commands, e-mail: dev-help@spark.apache.org
>     <ma...@spark.apache.org>
>
>

-- 
Jean-Baptiste Onofré
jbonofre@apache.org
http://blog.nanthrax.net
Talend - http://www.talend.com

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: Master build fails ?

Posted by Ted Yu <yu...@gmail.com>.
Interesting, the sbt builds were not all failing:

https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/

FYI

On Tue, Nov 3, 2015 at 5:58 AM, Jean-Baptiste Onofré <jb...@nanthrax.net>
wrote:

> Hi Jacek,
>
> it works fine with mvn: the problem is with sbt.
>
> I suspect a different reactor order in sbt compare to mvn.
>
> Regards
> JB
>
> On 11/03/2015 02:44 PM, Jacek Laskowski wrote:
>
>> Hi,
>>
>> Just built the sources using the following command and it worked fine.
>>
>> ➜  spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6
>> -Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver
>> -DskipTests clean install
>> ...
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] BUILD SUCCESS
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] Total time: 14:15 min
>> [INFO] Finished at: 2015-11-03T14:40:40+01:00
>> [INFO] Final Memory: 438M/1972M
>> [INFO]
>> ------------------------------------------------------------------------
>>
>> ➜  spark git:(master) ✗ java -version
>> java version "1.8.0_66"
>> Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
>> Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)
>>
>> I'm on Mac OS.
>>
>> Pozdrawiam,
>> Jacek
>>
>> --
>> Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
>> Follow me at https://twitter.com/jaceklaskowski
>> Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski
>>
>>
>> On Tue, Nov 3, 2015 at 1:37 PM, Jean-Baptiste Onofré <jb...@nanthrax.net>
>> wrote:
>>
>>> Thanks for the update, I used mvn to build but without hive profile.
>>>
>>> Let me try with mvn with the same options as you and sbt also.
>>>
>>> I keep you posted.
>>>
>>> Regards
>>> JB
>>>
>>> On 11/03/2015 12:55 PM, Jeff Zhang wrote:
>>>
>>>>
>>>> I found it is due to SPARK-11073.
>>>>
>>>> Here's the command I used to build
>>>>
>>>> build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
>>>> -Psparkr
>>>>
>>>> On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré <jb@nanthrax.net
>>>> <ma...@nanthrax.net>> wrote:
>>>>
>>>>      Hi Jeff,
>>>>
>>>>      it works for me (with skipping the tests).
>>>>
>>>>      Let me try again, just to be sure.
>>>>
>>>>      Regards
>>>>      JB
>>>>
>>>>
>>>>      On 11/03/2015 11:50 AM, Jeff Zhang wrote:
>>>>
>>>>          Looks like it's due to guava version conflicts, I see both
>>>> guava
>>>>          14.0.1
>>>>          and 16.0.1 under lib_managed/bundles. Anyone meet this issue
>>>> too ?
>>>>
>>>>          [error]
>>>>
>>>>
>>>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
>>>>          object HashCodes is not a member of package
>>>> com.google.common.hash
>>>>          [error] import com.google.common.hash.HashCodes
>>>>          [error]        ^
>>>>          [info] Resolving org.apache.commons#commons-math;2.2 ...
>>>>          [error]
>>>>
>>>>
>>>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
>>>>          not found: value HashCodes
>>>>          [error]         val cookie =
>>>> HashCodes.fromBytes(secret).toString()
>>>>          [error]                      ^
>>>>
>>>>
>>>>
>>>>
>>>>          --
>>>>          Best Regards
>>>>
>>>>          Jeff Zhang
>>>>
>>>>
>>>>      --
>>>>      Jean-Baptiste Onofré
>>>>      jbonofre@apache.org <ma...@apache.org>
>>>>      http://blog.nanthrax.net
>>>>      Talend - http://www.talend.com
>>>>
>>>>
>>>>  ---------------------------------------------------------------------
>>>>      To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>>      <ma...@spark.apache.org>
>>>>      For additional commands, e-mail: dev-help@spark.apache.org
>>>>      <ma...@spark.apache.org>
>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Best Regards
>>>>
>>>> Jeff Zhang
>>>>
>>>
>>>
>>> --
>>> Jean-Baptiste Onofré
>>> jbonofre@apache.org
>>> http://blog.nanthrax.net
>>> Talend - http://www.talend.com
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>
>>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>>
>>
> --
> Jean-Baptiste Onofré
> jbonofre@apache.org
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>

Re: Master build fails ?

Posted by Jean-Baptiste Onofré <jb...@nanthrax.net>.
Hi Jacek,

it works fine with mvn: the problem is with sbt.

I suspect a different reactor order in sbt compared to mvn.

Regards
JB

On 11/03/2015 02:44 PM, Jacek Laskowski wrote:
> Hi,
>
> Just built the sources using the following command and it worked fine.
>
> ➜  spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6
> -Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver
> -DskipTests clean install
> ...
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 14:15 min
> [INFO] Finished at: 2015-11-03T14:40:40+01:00
> [INFO] Final Memory: 438M/1972M
> [INFO] ------------------------------------------------------------------------
>
> ➜  spark git:(master) ✗ java -version
> java version "1.8.0_66"
> Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
> Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)
>
> I'm on Mac OS.
>
> Pozdrawiam,
> Jacek
>
> --
> Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
> Follow me at https://twitter.com/jaceklaskowski
> Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski
>
>
> On Tue, Nov 3, 2015 at 1:37 PM, Jean-Baptiste Onofré <jb...@nanthrax.net> wrote:
>> Thanks for the update, I used mvn to build but without hive profile.
>>
>> Let me try with mvn with the same options as you and sbt also.
>>
>> I keep you posted.
>>
>> Regards
>> JB
>>
>> On 11/03/2015 12:55 PM, Jeff Zhang wrote:
>>>
>>> I found it is due to SPARK-11073.
>>>
>>> Here's the command I used to build
>>>
>>> build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
>>> -Psparkr
>>>
>>> On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré <jb@nanthrax.net
>>> <ma...@nanthrax.net>> wrote:
>>>
>>>      Hi Jeff,
>>>
>>>      it works for me (with skipping the tests).
>>>
>>>      Let me try again, just to be sure.
>>>
>>>      Regards
>>>      JB
>>>
>>>
>>>      On 11/03/2015 11:50 AM, Jeff Zhang wrote:
>>>
>>>          Looks like it's due to guava version conflicts, I see both guava
>>>          14.0.1
>>>          and 16.0.1 under lib_managed/bundles. Anyone meet this issue too ?
>>>
>>>          [error]
>>>
>>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
>>>          object HashCodes is not a member of package com.google.common.hash
>>>          [error] import com.google.common.hash.HashCodes
>>>          [error]        ^
>>>          [info] Resolving org.apache.commons#commons-math;2.2 ...
>>>          [error]
>>>
>>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
>>>          not found: value HashCodes
>>>          [error]         val cookie =
>>> HashCodes.fromBytes(secret).toString()
>>>          [error]                      ^
>>>
>>>
>>>
>>>
>>>          --
>>>          Best Regards
>>>
>>>          Jeff Zhang
>>>
>>>
>>>      --
>>>      Jean-Baptiste Onofré
>>>      jbonofre@apache.org <ma...@apache.org>
>>>      http://blog.nanthrax.net
>>>      Talend - http://www.talend.com
>>>
>>>      ---------------------------------------------------------------------
>>>      To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>      <ma...@spark.apache.org>
>>>      For additional commands, e-mail: dev-help@spark.apache.org
>>>      <ma...@spark.apache.org>
>>>
>>>
>>>
>>>
>>> --
>>> Best Regards
>>>
>>> Jeff Zhang
>>
>>
>> --
>> Jean-Baptiste Onofré
>> jbonofre@apache.org
>> http://blog.nanthrax.net
>> Talend - http://www.talend.com
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>

-- 
Jean-Baptiste Onofré
jbonofre@apache.org
http://blog.nanthrax.net
Talend - http://www.talend.com



Re: Master build fails ?

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

Just built the sources using the following command and it worked fine.

➜  spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6
-Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver
-DskipTests clean install
...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:15 min
[INFO] Finished at: 2015-11-03T14:40:40+01:00
[INFO] Final Memory: 438M/1972M
[INFO] ------------------------------------------------------------------------

➜  spark git:(master) ✗ java -version
java version "1.8.0_66"
Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)

I'm on Mac OS.

Pozdrawiam,
Jacek

--
Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski


On Tue, Nov 3, 2015 at 1:37 PM, Jean-Baptiste Onofré <jb...@nanthrax.net> wrote:
> Thanks for the update, I used mvn to build but without hive profile.
>
> Let me try with mvn with the same options as you and sbt also.
>
> I keep you posted.
>
> Regards
> JB
>
> On 11/03/2015 12:55 PM, Jeff Zhang wrote:
>>
>> I found it is due to SPARK-11073.
>>
>> Here's the command I used to build
>>
>> build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
>> -Psparkr
>>
>> On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré <jb@nanthrax.net
>> <ma...@nanthrax.net>> wrote:
>>
>>     Hi Jeff,
>>
>>     it works for me (with skipping the tests).
>>
>>     Let me try again, just to be sure.
>>
>>     Regards
>>     JB
>>
>>
>>     On 11/03/2015 11:50 AM, Jeff Zhang wrote:
>>
>>         Looks like it's due to guava version conflicts, I see both guava
>>         14.0.1
>>         and 16.0.1 under lib_managed/bundles. Anyone meet this issue too ?
>>
>>         [error]
>>
>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
>>         object HashCodes is not a member of package com.google.common.hash
>>         [error] import com.google.common.hash.HashCodes
>>         [error]        ^
>>         [info] Resolving org.apache.commons#commons-math;2.2 ...
>>         [error]
>>
>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
>>         not found: value HashCodes
>>         [error]         val cookie =
>> HashCodes.fromBytes(secret).toString()
>>         [error]                      ^
>>
>>
>>
>>
>>         --
>>         Best Regards
>>
>>         Jeff Zhang
>>
>>
>>     --
>>     Jean-Baptiste Onofré
>>     jbonofre@apache.org <ma...@apache.org>
>>     http://blog.nanthrax.net
>>     Talend - http://www.talend.com
>>
>>     ---------------------------------------------------------------------
>>     To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>     <ma...@spark.apache.org>
>>     For additional commands, e-mail: dev-help@spark.apache.org
>>     <ma...@spark.apache.org>
>>
>>
>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>
>
> --
> Jean-Baptiste Onofré
> jbonofre@apache.org
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>



Re: Master build fails ?

Posted by Jean-Baptiste Onofré <jb...@nanthrax.net>.
Thanks for the update, I used mvn to build but without hive profile.

Let me try with mvn with the same options as you and sbt also.

I keep you posted.

Regards
JB

On 11/03/2015 12:55 PM, Jeff Zhang wrote:
> I found it is due to SPARK-11073.
>
> Here's the command I used to build
>
> build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
> -Psparkr
>
> On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré <jb@nanthrax.net
> <ma...@nanthrax.net>> wrote:
>
>     Hi Jeff,
>
>     it works for me (with skipping the tests).
>
>     Let me try again, just to be sure.
>
>     Regards
>     JB
>
>
>     On 11/03/2015 11:50 AM, Jeff Zhang wrote:
>
>         Looks like it's due to guava version conflicts, I see both guava
>         14.0.1
>         and 16.0.1 under lib_managed/bundles. Anyone meet this issue too ?
>
>         [error]
>         /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
>         object HashCodes is not a member of package com.google.common.hash
>         [error] import com.google.common.hash.HashCodes
>         [error]        ^
>         [info] Resolving org.apache.commons#commons-math;2.2 ...
>         [error]
>         /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
>         not found: value HashCodes
>         [error]         val cookie = HashCodes.fromBytes(secret).toString()
>         [error]                      ^
>
>
>
>
>         --
>         Best Regards
>
>         Jeff Zhang
>
>
>     --
>     Jean-Baptiste Onofré
>     jbonofre@apache.org <ma...@apache.org>
>     http://blog.nanthrax.net
>     Talend - http://www.talend.com
>
>     ---------------------------------------------------------------------
>     To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>     <ma...@spark.apache.org>
>     For additional commands, e-mail: dev-help@spark.apache.org
>     <ma...@spark.apache.org>
>
>
>
>
> --
> Best Regards
>
> Jeff Zhang

-- 
Jean-Baptiste Onofré
jbonofre@apache.org
http://blog.nanthrax.net
Talend - http://www.talend.com



Re: Master build fails ?

Posted by Saisai Shao <sa...@gmail.com>.
Yeah, I also met this problem; just curious why the Jenkins tests are OK.

On Tue, Nov 3, 2015 at 7:55 PM, Jeff Zhang <zj...@gmail.com> wrote:

> I found it is due to SPARK-11073.
>
> Here's the command I used to build
>
> build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
> -Psparkr
>
> On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré <jb...@nanthrax.net>
> wrote:
>
>> Hi Jeff,
>>
>> it works for me (with skipping the tests).
>>
>> Let me try again, just to be sure.
>>
>> Regards
>> JB
>>
>>
>> On 11/03/2015 11:50 AM, Jeff Zhang wrote:
>>
>>> Looks like it's due to guava version conflicts, I see both guava 14.0.1
>>> and 16.0.1 under lib_managed/bundles. Anyone meet this issue too ?
>>>
>>> [error]
>>>
>>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
>>> object HashCodes is not a member of package com.google.common.hash
>>> [error] import com.google.common.hash.HashCodes
>>> [error]        ^
>>> [info] Resolving org.apache.commons#commons-math;2.2 ...
>>> [error]
>>>
>>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
>>> not found: value HashCodes
>>> [error]         val cookie = HashCodes.fromBytes(secret).toString()
>>> [error]                      ^
>>>
>>>
>>>
>>>
>>> --
>>> Best Regards
>>>
>>> Jeff Zhang
>>>
>>
>> --
>> Jean-Baptiste Onofré
>> jbonofre@apache.org
>> http://blog.nanthrax.net
>> Talend - http://www.talend.com
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>>
>>
>
>
> --
> Best Regards
>
> Jeff Zhang
>

Re: Master build fails ?

Posted by Jeff Zhang <zj...@gmail.com>.
I found it is due to SPARK-11073.

Here's the command I used to build

build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver
-Psparkr
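Before rebuilding it can help to confirm that both guava jars really are on disk; a hypothetical shell helper for that (the lib_managed/bundles layout is taken from the report above, everything else is illustrative, not part of the Spark build):

```shell
# Hypothetical helper: list guava jars under a directory so duplicate
# versions such as 14.0.1 and 16.0.1 stand out at a glance.
find_dup_guava() {
  # $1: directory to scan; prints matching jar basenames, sorted
  find "$1" -name 'guava-*.jar' -exec basename {} \; | sort
}

# Typical use against a checkout showing the mixed versions, then a clean rebuild:
# find_dup_guava lib_managed
# rm -rf lib_managed
# build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -Psparkr
```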

On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré <jb...@nanthrax.net>
wrote:

> Hi Jeff,
>
> it works for me (with skipping the tests).
>
> Let me try again, just to be sure.
>
> Regards
> JB
>
>
> On 11/03/2015 11:50 AM, Jeff Zhang wrote:
>
>> Looks like it's due to guava version conflicts, I see both guava 14.0.1
>> and 16.0.1 under lib_managed/bundles. Anyone meet this issue too ?
>>
>> [error]
>>
>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
>> object HashCodes is not a member of package com.google.common.hash
>> [error] import com.google.common.hash.HashCodes
>> [error]        ^
>> [info] Resolving org.apache.commons#commons-math;2.2 ...
>> [error]
>>
>> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
>> not found: value HashCodes
>> [error]         val cookie = HashCodes.fromBytes(secret).toString()
>> [error]                      ^
>>
>>
>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>
> --
> Jean-Baptiste Onofré
> jbonofre@apache.org
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>


-- 
Best Regards

Jeff Zhang

Re: Master build fails ?

Posted by Jean-Baptiste Onofré <jb...@nanthrax.net>.
Hi Jeff,

it works for me (with skipping the tests).

Let me try again, just to be sure.

Regards
JB

On 11/03/2015 11:50 AM, Jeff Zhang wrote:
> Looks like it's due to guava version conflicts, I see both guava 14.0.1
> and 16.0.1 under lib_managed/bundles. Anyone meet this issue too ?
>
> [error]
> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
> object HashCodes is not a member of package com.google.common.hash
> [error] import com.google.common.hash.HashCodes
> [error]        ^
> [info] Resolving org.apache.commons#commons-math;2.2 ...
> [error]
> /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
> not found: value HashCodes
> [error]         val cookie = HashCodes.fromBytes(secret).toString()
> [error]                      ^
>
>
>
>
> --
> Best Regards
>
> Jeff Zhang

-- 
Jean-Baptiste Onofré
jbonofre@apache.org
http://blog.nanthrax.net
Talend - http://www.talend.com



Re: Master build fails ?

Posted by Steve Loughran <st...@hortonworks.com>.
SBT/ivy pulls in the most recent version of a JAR, whereas maven pulls in the "closest" one, where closest means the lowest distance/depth from the root of the dependency tree.
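That selection-rule difference can be sketched with a toy example: the same candidate list of version:depth pairs (guava 14.0.1 near the root, 16.0.1 deeper in the tree) run through each rule. This is not the real resolvers, just the two selection rules; GNU sort's -V (version sort) is assumed.

```shell
# Toy model: "version:depth" candidates for the same artifact.
candidates='14.0.1:1
16.0.1:3'

# maven-style "nearest wins": keep the candidate with the smallest depth
maven_pick() { sort -t: -k2,2n | head -n1 | cut -d: -f1; }
# ivy-style "latest revision wins": keep the highest version number
ivy_pick()   { sort -t: -k1,1V | tail -n1 | cut -d: -f1; }

printf '%s\n' "$candidates" | maven_pick   # -> 14.0.1 (what mvn keeps)
printf '%s\n' "$candidates" | ivy_pick     # -> 16.0.1 (what sbt/ivy keeps)
```

Which is exactly why the same pom can compile under mvn and fail under sbt when the two picks differ.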


> On 5 Nov 2015, at 18:53, Marcelo Vanzin <va...@cloudera.com> wrote:
> 
> Seems like it's an sbt issue, not a maven one, so "dependency:tree"
> might not help. Still, the command line would be helpful. I use sbt
> and don't see this.
> 
> On Thu, Nov 5, 2015 at 10:44 AM, Marcelo Vanzin <va...@cloudera.com> wrote:
>> Hi Jeff,
>> 
>> On Tue, Nov 3, 2015 at 2:50 AM, Jeff Zhang <zj...@gmail.com> wrote:
>>> Looks like it's due to guava version conflicts, I see both guava 14.0.1 and
>>> 16.0.1 under lib_managed/bundles. Anyone meet this issue too ?
>> 
>> What command line are you using to build? Can you run "mvn
>> dependency:tree" (with all the other options you're using) to figure
>> out where guava 16 is coming from? Locally I only see version 14,
>> compiling against hadoop 2.5.0.
>> 
>> --
>> Marcelo
> 
> 
> 
> -- 
> Marcelo
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org




Re: Master build fails ?

Posted by Marcelo Vanzin <va...@cloudera.com>.
Seems like it's an sbt issue, not a maven one, so "dependency:tree"
might not help. Still, the command line would be helpful. I use sbt
and don't see this.

On Thu, Nov 5, 2015 at 10:44 AM, Marcelo Vanzin <va...@cloudera.com> wrote:
> Hi Jeff,
>
> On Tue, Nov 3, 2015 at 2:50 AM, Jeff Zhang <zj...@gmail.com> wrote:
>> Looks like it's due to guava version conflicts, I see both guava 14.0.1 and
>> 16.0.1 under lib_managed/bundles. Anyone meet this issue too ?
>
> What command line are you using to build? Can you run "mvn
> dependency:tree" (with all the other options you're using) to figure
> out where guava 16 is coming from? Locally I only see version 14,
> compiling against hadoop 2.5.0.
>
> --
> Marcelo



-- 
Marcelo



Re: Master build fails ?

Posted by Marcelo Vanzin <va...@cloudera.com>.
Hi Jeff,

On Tue, Nov 3, 2015 at 2:50 AM, Jeff Zhang <zj...@gmail.com> wrote:
> Looks like it's due to guava version conflicts, I see both guava 14.0.1 and
> 16.0.1 under lib_managed/bundles. Anyone meet this issue too ?

What command line are you using to build? Can you run "mvn
dependency:tree" (with all the other options you're using) to figure
out where guava 16 is coming from? Locally I only see version 14,
compiling against hadoop 2.5.0.
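A sketch of that diagnosis: the -Dincludes filter is a standard maven-dependency-plugin option, and a small (hypothetical) grep helper can collapse saved resolver output — sbt or mvn — to the distinct guava coordinates it mentions. The sample tree lines below are illustrative, not real Spark output.

```shell
# Against a Spark checkout (add your usual profiles/flags):
# build/mvn dependency:tree -Dincludes=com.google.guava:guava

# Hypothetical filter: print each distinct guava coordinate once,
# so conflicting versions are obvious at a glance.
guava_versions() {
  grep -o 'com\.google\.guava:guava:[0-9.]*' | sort -u
}

printf '%s\n' \
  '[info] +-com.google.guava:guava:14.0.1' \
  '[info] +-org.example:sample-dep:1.0 -> com.google.guava:guava:16.0.1' \
  | guava_versions
# -> com.google.guava:guava:14.0.1
#    com.google.guava:guava:16.0.1
```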

-- 
Marcelo
