Posted to dev@mesos.apache.org by Brenden Matthews <br...@diddyinc.com> on 2013/04/15 23:21:33 UTC

Review Request: Run Hadoop tutorial binaries from within build dir.

-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/10492/
-----------------------------------------------------------

Review request for mesos.


Description
-------

Run Hadoop tutorial binaries from within build dir.
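The general shape of such a change can be sketched as follows; this is an illustration only, not the actual patch. The `MESOS_BUILD_DIR` variable, the `src/mesos-executor` path, and the fallback behavior are assumptions:

```shell
#!/bin/sh
# Illustrative sketch: prefer the freshly built binary in the Mesos
# build directory over any installed copy. MESOS_BUILD_DIR and the
# binary path are assumptions, not taken from the actual diff.
MESOS_BUILD_DIR="${MESOS_BUILD_DIR:-$(pwd)}"

if [ -x "${MESOS_BUILD_DIR}/src/mesos-executor" ]; then
  # Use the binary from the build tree.
  MESOS_EXECUTOR="${MESOS_BUILD_DIR}/src/mesos-executor"
else
  # Fall back to whatever is on the PATH at run time.
  MESOS_EXECUTOR="mesos-executor"
fi

echo "Using executor: ${MESOS_EXECUTOR}"
```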


Diffs
-----

  hadoop/TUTORIAL.sh f8131cd 

Diff: https://reviews.apache.org/r/10492/diff/


Testing
-------

Used in production at Airbnb.


Thanks,

Brenden Matthews


Re: Review Request: Run Hadoop tutorial binaries from within build dir.

Posted by Vinod Kone <vi...@gmail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/10492/#review19256
-----------------------------------------------------------



hadoop/TUTORIAL.sh
<https://reviews.apache.org/r/10492/#comment39873>

    please confirm this works locally.


- Vinod Kone




Re: Review Request: Run Hadoop tutorial binaries from within build dir.

Posted by Vinod Kone <vi...@gmail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/10492/#review19486
-----------------------------------------------------------

Ship it!


Ship It!

- Vinod Kone




Re: FYI -- Failure building Mesos on new AWS Linux AMI

Posted by "Mattmann, Chris A (398J)" <ch...@jpl.nasa.gov>.
Thanks Jim, sounds good.

Cheers,
Chris

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Chris Mattmann, Ph.D.
Senior Computer Scientist
NASA Jet Propulsion Laboratory Pasadena, CA 91109 USA
Office: 171-266B, Mailstop: 171-246
Email: chris.a.mattmann@nasa.gov
WWW:  http://sunset.usc.edu/~mattmann/
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Adjunct Assistant Professor, Computer Science Department
University of Southern California, Los Angeles, CA 90089 USA
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++








FYI -- Failure building Mesos on new AWS Linux AMI

Posted by Jim Donahue <jd...@adobe.com>.
For reasons that I can't possibly begin to understand, building Mesos
fails on the most recent AWS Linux AMIs.  However, if you use ami-41814f28
(an older 64-bit AMI), it builds fine.

I made the mistake yesterday of switching to their new AMI and had lots of
problems, which I first mistakenly attributed to Mesos.  My apologies ...
I'll be taking up the issue with Amazon and I'll report back when I find
something out.


Jim


Re: Did something change in building hadoop-0.20.205.0???

Posted by Jim Donahue <jd...@adobe.com>.
Well, I can't get into the Hadoop installation to answer the question … This afternoon, the build is failing earlier:

make[4]: Entering directory `/usr/local/var/mesos/third_party/leveldb'
g++ -pthread -shared -Wl,-soname -Wl,/usr/local/var/mesos/third_party/leveldb/libleveldb.so.1 -I. -I./include -fno-builtin-memcmp -pthread -DOS_LINUX -DLEVELDB_PLATFORM_POSIX -g -g2 -O2 -fPIC -fPIC db/builder.cc db/c.cc db/dbformat.cc db/db_impl.cc db/db_iter.cc db/filename.cc db/log_reader.cc db/log_writer.cc db/memtable.cc db/repair.cc db/table_cache.cc db/version_edit.cc db/version_set.cc db/write_batch.cc table/block_builder.cc table/block.cc table/filter_block.cc table/format.cc table/iterator.cc table/merger.cc table/table_builder.cc table/table.cc table/two_level_iterator.cc util/arena.cc util/bloom.cc util/cache.cc util/coding.cc util/comparator.cc util/crc32c.cc util/env.cc util/env_posix.cc util/filter_policy.cc util/hash.cc util/histogram.cc util/logging.cc util/options.cc util/status.cc port/port_posix.cc -o libleveldb.so.1.4
ln -fs libleveldb.so.1.4 libleveldb.so
ln -fs libleveldb.so.1.4 libleveldb.so.1
g++ -I. -I./include -fno-builtin-memcmp -pthread -DOS_LINUX -DLEVELDB_PLATFORM_POSIX -g -g2 -O2 -fPIC -c db/builder.cc -o db/builder.o
g++ -I. -I./include -fno-builtin-memcmp -pthread -DOS_LINUX -DLEVELDB_PLATFORM_POSIX -g -g2 -O2 -fPIC -c db/c.cc -o db/c.o
g++ -I. -I./include -fno-builtin-memcmp -pthread -DOS_LINUX -DLEVELDB_PLATFORM_POSIX -g -g2 -O2 -fPIC -c db/dbformat.cc -o db/dbformat.o
g++ -I. -I./include -fno-builtin-memcmp -pthread -DOS_LINUX -DLEVELDB_PLATFORM_POSIX -g -g2 -O2 -fPIC -c db/db_impl.cc -o db/db_impl.o
Assembler messages:
Fatal error: can't create db/db_impl.o: No such file or directory
make[4]: *** [db/db_impl.o] Error 1
make[4]: Leaving directory `/usr/local/var/mesos/third_party/leveldb'
make[3]: *** [leveldb/libleveldb.a] Error 2
make[3]: Leaving directory `/usr/local/var/mesos/third_party'
make[2]: *** [all-recursive] Error 1
make[2]: Leaving directory `/usr/local/var/mesos/third_party'
make[1]: *** [all] Error 2
make[1]: Leaving directory `/usr/local/var/mesos/third_party'
make: *** [all-recursive] Error 1



Jim


On 4/19/13 7:23 PM, "Benjamin Mahler" <be...@gmail.com> wrote:

This was committed. As for your TUTORIAL.sh issues, can you confirm those
are consistently failing?

'make hadoop-0.20.205.0' is working correctly on my end.


On Fri, Apr 19, 2013 at 7:18 PM, Benjamin Mahler
<be...@gmail.com> wrote:

Alright, I'm seeing a separate issue with the patch:

Patch conf/hadoop-env.sh? [N]

   $ patch -p1 <../hadoop-0.20.205.0_hadoop-env.sh.patch

Hit enter to continue.
patching file conf/hadoop-env.sh
Hunk #1 FAILED at 9.
1 out of 1 hunk FAILED -- saving rejects to file conf/hadoop-env.sh.rej

I have a fix that I will be committing shortly:
https://reviews.apache.org/r/10668/


On Fri, Apr 19, 2013 at 6:45 PM, Jim Donahue <jd...@adobe.com> wrote:

Yup, I tried it on both a 32-bit and 64-bit Amazon Linux instance and got
the same behavior.  It is possible that I inadvertently made some change
in my build scripts that caused it to fail, but the scripts are pretty
stable and I can't think of any change that I made to both (32 and 64-bit)
of them that would cause a problem.

Jim

On 4/19/13 6:39 PM, "Benjamin Mahler" <be...@gmail.com> wrote:

>Well that is unexpected. Is that consistently failing for you?
>
>
>On Fri, Apr 19, 2013 at 6:29 PM, Jim Donahue <jd...@adobe.com> wrote:
>
>> Also the example seems to have stopped working …  That's not a serious
>> problem (I can just ignore it) but it did work last week as far as I
can
>> remember. :-)
>>
>> Waiting 5 seconds for it to start. . . . . .
>> Alright, now let's run the "wordcount" example via:
>>
>> $ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount
>> src/contrib/mesos/src/java/org/apache/hadoop/mapred out
>>
>> Hit enter to continue.
>> 13/04/19 23:31:54 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 0 time(s).
>> 13/04/19 23:31:55 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 1 time(s).
>> 13/04/19 23:31:56 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 2 time(s).
>> 13/04/19 23:31:57 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 3 time(s).
>> 13/04/19 23:31:58 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 4 time(s).
>> 13/04/19 23:31:59 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 5 time(s).
>> 13/04/19 23:32:00 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 6 time(s).
>> 13/04/19 23:32:01 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 7 time(s).
>> 13/04/19 23:32:02 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 8 time(s).
>> 13/04/19 23:32:03 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 9 time(s).
>> java.net.ConnectException: Call to localhost/127.0.0.1:54311 failed on
>> connection exception: java.net.ConnectException: Connection refused
>> at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>> at org.apache.hadoop.mapred.$Proxy1.getProtocolVersion(Unknown Source)
>> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>> at
org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:478)
>> at org.apache.hadoop.mapred.JobClient.init(JobClient.java:472)
>> at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:455)
>> at org.apache.hadoop.mapreduce.Job$1.run(Job.java:478)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at javax.security.auth.Subject.doAs(Subject.java:416)
>> at
>>

>>org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation
>>.java:1059)
>> at org.apache.hadoop.mapreduce.Job.connect(Job.java:476)
>> at org.apache.hadoop.mapreduce.Job.submit(Job.java:464)
>> at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:495)
>> at org.apache.hadoop.examples.WordCount.main(WordCount.java:67)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>

>>sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java
>>:57)
>> at
>>

>>sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorI
>>mpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:616)
>> at
>>

>>org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDri
>>ver.java:68)
>> at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>> at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>

>>sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java
>>:57)
>> at
>>

>>sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorI
>>mpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:616)
>> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> Caused by: java.net.ConnectException: Connection refused
>> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>> at
>>sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:592)
>> at
>>

>>org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.jav
>>a:206)
>> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:604)
>> at
>>org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
>> at
>>org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
>> at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
>> at org.apache.hadoop.ipc.Client.getConnection(Client.java:1202)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1046)
>> ... 27 more
>>
>> Oh no, it failed! Try running the JobTracker and wordcount
>> example manually ... it might be an issue with your environment that
>> this tutorial didn't cover (if you find this to be the case, please
>> create a JIRA for us and/or send us a code review).
>>
>> ./TUTORIAL.sh: line 662: kill: (1522) - No such process
>> make: *** [hadoop-0.20.205.0] Error 1
>>
>>
>> On 4/19/13 4:29 PM, "Benjamin Mahler" <be...@gmail.com> wrote:
>>
>> Brenden: It looks like Maven isn't required when building
>> hadoop-0.20.205.0, can you send a patch to fix your change to only
check
>> for Maven when building the CDH releases?
>>
>> Jim: Thanks for the report.
>>
>> I committed a recent change by Brenden here, which enforces that both
>>'ant'
>> and 'mvn' are present when building the hadoop port:
>> https://reviews.apache.org/r/10558/
>>
>>
>> On Fri, Apr 19, 2013 at 3:51 PM, Jim Donahue <jd...@adobe.com> wrote:
>>
>> I was -- the last build I did was ten days ago.  Somebody broke the
>>build
>> scripts that I've been using for quite a while.
>>
>> Jim
>>
>>
>>
>> On 4/19/13 3:48 PM, "Benjamin Mahler" <be...@gmail.com> wrote:
>>
>> >You can fix this by installing Maven.
>> >
>> >However, I was under the assumption that we required Maven in order to
>>run
>> >the Hadoop tutorial. You were successfully building hadoop without
>>Maven
>> >installed?
>> >
>> >
>> >On Fri, Apr 19, 2013 at 3:44 PM, Jim Donahue <jd...@adobe.com> wrote:
>> >
>> >> I'm trying to build Mesos on Amazon Linux and it appears that the
>>Hadoop
>> >> build script has changed.  It worked just fine a few days ago, but
>>now
>> >>I'm
>> >> getting:
>> >>
>> >> sudo make hadoop-0.20.205.0
>> >> if test ".." != ".."; then \
>> >> cp -p ./TUTORIAL.sh .; \
>> >> cp -p ./hadoop-gridmix.patch .; \
>> >> cp -p ./hadoop-7698-1.patch .; \
>> >> cp -p ./hadoop-0.20.205.0_hadoop-env.sh.patch .; \
>> >> cp -p ./hadoop-0.20.205.0_mesos.patch .; \
>> >> cp -p ./mapred-site.xml.patch .; \
>> >> cp -rp ./mesos .; \
>> >> cp -p ./mesos-executor .; \
>> >> fi
>> >> rm -rf hadoop-0.20.205.0
>> >> which: no mvn in (/sbin:/bin:/usr/sbin:/usr/bin)
>> >>
>> >> We seem to be missing mvn from the path. Please install
>> >> mvn and re-run this tutorial. If you still have troubles, please
>>report
>> >> this to:
>> >>
>> >> mesos-dev@incubator.apache.org
>> >>
>> >> (Remember to include as much debug information as possible.)
>> >>
>> >> Help, please!
>> >>
>> >>
>> >> Jim
>> >>
>>
>>
>>
>>





Re: Did something change in building hadoop-0.20.205.0???

Posted by Benjamin Mahler <be...@gmail.com>.
This was committed. As for your TUTORIAL.sh issues, can you confirm those
are consistently failing?

'make hadoop-0.20.205.0' is working correctly on my end.
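The tool check discussed in this thread (Maven required only for the CDH ports, while the stock 0.20.205.0 build needs only ant) could be sketched like this. The `*cdh*` release-name pattern and the helper names are assumptions for illustration, not the actual TUTORIAL.sh code:

```shell
#!/bin/sh
# Sketch of a conditional tool check: only the CDH releases are
# built with Maven, so 'mvn' is required only for them.
needs_maven() {
  case "$1" in
    *cdh*) return 0 ;;  # CDH ports use Maven (assumed naming pattern)
    *)     return 1 ;;  # stock releases (e.g. 0.20.205.0) do not
  esac
}

have() { which "$1" >/dev/null 2>&1; }

release="hadoop-0.20.205.0"
if needs_maven "$release" && ! have mvn; then
  echo "We seem to be missing mvn from the path. Please install mvn." >&2
  exit 1
fi
echo "tool check passed for $release"
```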



Re: Did something change in building hadoop-0.20.205.0???

Posted by Benjamin Mahler <be...@gmail.com>.
Alright, I'm seeing a separate issue with the patch:

Patch conf/hadoop-env.sh? [N]

  $ patch -p1 <../hadoop-0.20.205.0_hadoop-env.sh.patch

Hit enter to continue.
patching file conf/hadoop-env.sh
Hunk #1 FAILED at 9.
1 out of 1 hunk FAILED -- saving rejects to file conf/hadoop-env.sh.rej

I have a fix that I will be committing shortly:
https://reviews.apache.org/r/10668/
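One way a script can detect a failing hunk like this before anything is modified is to probe with `patch --dry-run`. This is only a sketch of that idea, not the committed fix in r/10668; the wrapper function is an assumption:

```shell
#!/bin/sh
# Sketch: probe a patch with --dry-run so a failing hunk is reported
# before the tree is touched and no .rej files are left behind.
apply_patch() {
  patchfile="$1"
  if patch -p1 --dry-run < "$patchfile" >/dev/null 2>&1; then
    # Dry run succeeded: apply for real.
    patch -p1 < "$patchfile"
  else
    echo "Patch ${patchfile} does not apply cleanly; skipping." >&2
    return 1
  fi
}
```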



Re: Did something change in building hadoop-0.20.205.0???

Posted by Jim Donahue <jd...@adobe.com>.
Yup, I tried it on both a 32-bit and 64-bit Amazon Linux instance and got
the same behavior.  It is possible that I inadvertently made some change
in my build scripts that caused it to fail, but the scripts are pretty
stable and I can't think of any change that I made to both (32 and 64-bit)
of them that would cause a problem.

Jim

On 4/19/13 6:39 PM, "Benjamin Mahler" <be...@gmail.com> wrote:

>Well that is unexpected. Is that consistently failing for you?
>
>
>On Fri, Apr 19, 2013 at 6:29 PM, Jim Donahue <jd...@adobe.com> wrote:
>
>> Also the example seems to have stopped working Š  That's not a serious
>> problem (I can just ignore it) but it did work last week as far as I can
>> remember. :-)
>>
>> Waiting 5 seconds for it to start. . . . . .
>> Alright, now let's run the "wordcount" example via:
>>
>> $ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount
>> src/contrib/mesos/src/java/org/apache/hadoop/mapred out
>>
>> Hit enter to continue.
>> 13/04/19 23:31:54 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 0 time(s).
>> 13/04/19 23:31:55 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 1 time(s).
>> 13/04/19 23:31:56 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 2 time(s).
>> 13/04/19 23:31:57 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 3 time(s).
>> 13/04/19 23:31:58 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 4 time(s).
>> 13/04/19 23:31:59 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 5 time(s).
>> 13/04/19 23:32:00 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 6 time(s).
>> 13/04/19 23:32:01 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 7 time(s).
>> 13/04/19 23:32:02 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 8 time(s).
>> 13/04/19 23:32:03 INFO ipc.Client: Retrying connect to server:
>>localhost/
>> 127.0.0.1:54311. Already tried 9 time(s).
>> java.net.ConnectException: Call to localhost/127.0.0.1:54311 failed on
>> connection exception: java.net.ConnectException: Connection refused
>> at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>> at org.apache.hadoop.mapred.$Proxy1.getProtocolVersion(Unknown Source)
>> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>> at org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:478)
>> at org.apache.hadoop.mapred.JobClient.init(JobClient.java:472)
>> at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:455)
>> at org.apache.hadoop.mapreduce.Job$1.run(Job.java:478)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at javax.security.auth.Subject.doAs(Subject.java:416)
>> at
>> 
>>org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation
>>.java:1059)
>> at org.apache.hadoop.mapreduce.Job.connect(Job.java:476)
>> at org.apache.hadoop.mapreduce.Job.submit(Job.java:464)
>> at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:495)
>> at org.apache.hadoop.examples.WordCount.main(WordCount.java:67)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> 
>>sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java
>>:57)
>> at
>> 
>>sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorI
>>mpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:616)
>> at
>> 
>>org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDri
>>ver.java:68)
>> at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>> at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> 
>>sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java
>>:57)
>> at
>> 
>>sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorI
>>mpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:616)
>> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> Caused by: java.net.ConnectException: Connection refused
>> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>> at 
>>sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:592)
>> at
>> 
>>org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.jav
>>a:206)
>> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:604)
>> at 
>>org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
>> at 
>>org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
>> at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
>> at org.apache.hadoop.ipc.Client.getConnection(Client.java:1202)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1046)
>> ... 27 more
>>
>> Oh no, it failed! Try running the JobTracker and wordcount
>> example manually ... it might be an issue with your environment that
>> this tutorial didn't cover (if you find this to be the case, please
>> create a JIRA for us and/or send us a code review).
>>
>> ./TUTORIAL.sh: line 662: kill: (1522) - No such process
>> make: *** [hadoop-0.20.205.0] Error 1
>>
>>
>> On 4/19/13 4:29 PM, "Benjamin Mahler" <benjamin.mahler@gmail.com<mailto:
>> benjamin.mahler@gmail.com>> wrote:
>>
>> Brenden: It looks like Maven isn't required when building
>> hadoop-0.20.205.0, can you send a patch to fix your change to only check
>> for Maven when building the CDH releases?
>>
>> Jim: Thanks for the report.
>>
>> I committed a recent change by Brenden here, which enforces that both
>>'ant'
>> and 'mvn' are present when building the hadoop port:
>> https://reviews.apache.org/r/10558/
>>
>>
>> On Fri, Apr 19, 2013 at 3:51 PM, Jim Donahue <jdonahue@adobe.com<mailto:
>> jdonahue@adobe.com>> wrote:
>>
>> I was -- the last build I did was ten days ago.  Somebody broke the
>>build
>> scripts that I've been using for quite a while.
>>
>> Jim
>>
>>
>>
>> On 4/19/13 3:48 PM, "Benjamin Mahler" <benjamin.mahler@gmail.com<mailto:
>> benjamin.mahler@gmail.com>> wrote:
>>
>> >You can fix this by installing Maven.
>> >
>> >However, I was under the assumption that we required Maven in order to
>>run
>> >the Hadoop tutorial. You were successfully building hadoop without
>>Maven
>> >installed?
>> >
>> >
>> >On Fri, Apr 19, 2013 at 3:44 PM, Jim Donahue
>><jdonahue@adobe.com<mailto:
>> jdonahue@adobe.com>> wrote:
>> >
>> >> I'm trying to build Mesos on Amazon Linux and it appears that the
>>Hadoop
>> >> build script has changed.  It worked just fine a few days ago, but
>>now
>> >>I'm
>> >> getting:
>> >>
>> >> sudo make hadoop-0.20.205.0
>> >> if test ".." != ".."; then \
>> >> cp -p ./TUTORIAL.sh .; \
>> >> cp -p ./hadoop-gridmix.patch .; \
>> >> cp -p ./hadoop-7698-1.patch .; \
>> >> cp -p ./hadoop-0.20.205.0_hadoop-env.sh.patch .; \
>> >> cp -p ./hadoop-0.20.205.0_mesos.patch .; \
>> >> cp -p ./mapred-site.xml.patch .; \
>> >> cp -rp ./mesos .; \
>> >> cp -p ./mesos-executor .; \
>> >> fi
>> >> rm -rf hadoop-0.20.205.0
>> >> which: no mvn in (/sbin:/bin:/usr/sbin:/usr/bin)
>> >>
>> >> We seem to be missing mvn from the path. Please install
>> >> mvn and re-run this tutorial. If you still have troubles, please
>>report
>> >> this to:
>> >>
>> >> mesos-dev@incubator.apache.org<ma...@incubator.apache.org>
>> >>
>> >> (Remember to include as much debug information as possible.)
>> >>
>> >> Help, please!
>> >>
>> >>
>> >> Jim
>> >>
>>
>>
>>
>>


Re: Did something change in building hadoop-0.20.205.0???

Posted by Benjamin Mahler <be...@gmail.com>.
Well that is unexpected. Is that consistently failing for you?


On Fri, Apr 19, 2013 at 6:29 PM, Jim Donahue <jd...@adobe.com> wrote:

> Also the example seems to have stopped working …  That's not a serious
> problem (I can just ignore it) but it did work last week as far as I can
> remember. :-)
>
> Waiting 5 seconds for it to start. . . . . .
> Alright, now let's run the "wordcount" example via:
>
> $ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount
> src/contrib/mesos/src/java/org/apache/hadoop/mapred out
>
> Hit enter to continue.
> 13/04/19 23:31:54 INFO ipc.Client: Retrying connect to server: localhost/
> 127.0.0.1:54311. Already tried 0 time(s).
> 13/04/19 23:31:55 INFO ipc.Client: Retrying connect to server: localhost/
> 127.0.0.1:54311. Already tried 1 time(s).
> 13/04/19 23:31:56 INFO ipc.Client: Retrying connect to server: localhost/
> 127.0.0.1:54311. Already tried 2 time(s).
> 13/04/19 23:31:57 INFO ipc.Client: Retrying connect to server: localhost/
> 127.0.0.1:54311. Already tried 3 time(s).
> 13/04/19 23:31:58 INFO ipc.Client: Retrying connect to server: localhost/
> 127.0.0.1:54311. Already tried 4 time(s).
> 13/04/19 23:31:59 INFO ipc.Client: Retrying connect to server: localhost/
> 127.0.0.1:54311. Already tried 5 time(s).
> 13/04/19 23:32:00 INFO ipc.Client: Retrying connect to server: localhost/
> 127.0.0.1:54311. Already tried 6 time(s).
> 13/04/19 23:32:01 INFO ipc.Client: Retrying connect to server: localhost/
> 127.0.0.1:54311. Already tried 7 time(s).
> 13/04/19 23:32:02 INFO ipc.Client: Retrying connect to server: localhost/
> 127.0.0.1:54311. Already tried 8 time(s).
> 13/04/19 23:32:03 INFO ipc.Client: Retrying connect to server: localhost/
> 127.0.0.1:54311. Already tried 9 time(s).
> java.net.ConnectException: Call to localhost/127.0.0.1:54311 failed on
> connection exception: java.net.ConnectException: Connection refused
> at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
> at org.apache.hadoop.ipc.Client.call(Client.java:1071)
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
> at org.apache.hadoop.mapred.$Proxy1.getProtocolVersion(Unknown Source)
> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
> at org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:478)
> at org.apache.hadoop.mapred.JobClient.init(JobClient.java:472)
> at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:455)
> at org.apache.hadoop.mapreduce.Job$1.run(Job.java:478)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:416)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
> at org.apache.hadoop.mapreduce.Job.connect(Job.java:476)
> at org.apache.hadoop.mapreduce.Job.submit(Job.java:464)
> at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:495)
> at org.apache.hadoop.examples.WordCount.main(WordCount.java:67)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:616)
> at
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
> at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
> at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:616)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.net.ConnectException: Connection refused
> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:592)
> at
> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:604)
> at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
> at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
> at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
> at org.apache.hadoop.ipc.Client.getConnection(Client.java:1202)
> at org.apache.hadoop.ipc.Client.call(Client.java:1046)
> ... 27 more
>
> Oh no, it failed! Try running the JobTracker and wordcount
> example manually ... it might be an issue with your environment that
> this tutorial didn't cover (if you find this to be the case, please
> create a JIRA for us and/or send us a code review).
>
> ./TUTORIAL.sh: line 662: kill: (1522) - No such process
> make: *** [hadoop-0.20.205.0] Error 1
>
>
> On 4/19/13 4:29 PM, "Benjamin Mahler" <benjamin.mahler@gmail.com<mailto:
> benjamin.mahler@gmail.com>> wrote:
>
> Brenden: It looks like Maven isn't required when building
> hadoop-0.20.205.0, can you send a patch to fix your change to only check
> for Maven when building the CDH releases?
>
> Jim: Thanks for the report.
>
> I committed a recent change by Brenden here, which enforces that both 'ant'
> and 'mvn' are present when building the hadoop port:
> https://reviews.apache.org/r/10558/
>
>
> On Fri, Apr 19, 2013 at 3:51 PM, Jim Donahue <jdonahue@adobe.com<mailto:
> jdonahue@adobe.com>> wrote:
>
> I was -- the last build I did was ten days ago.  Somebody broke the build
> scripts that I've been using for quite a while.
>
> Jim
>
>
>
> On 4/19/13 3:48 PM, "Benjamin Mahler" <benjamin.mahler@gmail.com<mailto:
> benjamin.mahler@gmail.com>> wrote:
>
> >You can fix this by installing Maven.
> >
> >However, I was under the assumption that we required Maven in order to run
> >the Hadoop tutorial. You were successfully building hadoop without Maven
> >installed?
> >
> >
> >On Fri, Apr 19, 2013 at 3:44 PM, Jim Donahue <jdonahue@adobe.com<mailto:
> jdonahue@adobe.com>> wrote:
> >
> >> I'm trying to build Mesos on Amazon Linux and it appears that the Hadoop
> >> build script has changed.  It worked just fine a few days ago, but now
> >>I'm
> >> getting:
> >>
> >> sudo make hadoop-0.20.205.0
> >> if test ".." != ".."; then \
> >> cp -p ./TUTORIAL.sh .; \
> >> cp -p ./hadoop-gridmix.patch .; \
> >> cp -p ./hadoop-7698-1.patch .; \
> >> cp -p ./hadoop-0.20.205.0_hadoop-env.sh.patch .; \
> >> cp -p ./hadoop-0.20.205.0_mesos.patch .; \
> >> cp -p ./mapred-site.xml.patch .; \
> >> cp -rp ./mesos .; \
> >> cp -p ./mesos-executor .; \
> >> fi
> >> rm -rf hadoop-0.20.205.0
> >> which: no mvn in (/sbin:/bin:/usr/sbin:/usr/bin)
> >>
> >> We seem to be missing mvn from the path. Please install
> >> mvn and re-run this tutorial. If you still have troubles, please report
> >> this to:
> >>
> >> mesos-dev@incubator.apache.org<ma...@incubator.apache.org>
> >>
> >> (Remember to include as much debug information as possible.)
> >>
> >> Help, please!
> >>
> >>
> >> Jim
> >>
>
>
>
>

Re: Did something change in building hadoop-0.20.205.0???

Posted by Jim Donahue <jd...@adobe.com>.
Also the example seems to have stopped working …  That's not a serious problem (I can just ignore it) but it did work last week as far as I can remember. :-)

Waiting 5 seconds for it to start. . . . . .
Alright, now let's run the "wordcount" example via:

$ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount src/contrib/mesos/src/java/org/apache/hadoop/mapred out

Hit enter to continue.
13/04/19 23:31:54 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 0 time(s).
13/04/19 23:31:55 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 1 time(s).
13/04/19 23:31:56 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 2 time(s).
13/04/19 23:31:57 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 3 time(s).
13/04/19 23:31:58 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 4 time(s).
13/04/19 23:31:59 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 5 time(s).
13/04/19 23:32:00 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 6 time(s).
13/04/19 23:32:01 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 7 time(s).
13/04/19 23:32:02 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 8 time(s).
13/04/19 23:32:03 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 9 time(s).
java.net.ConnectException: Call to localhost/127.0.0.1:54311 failed on connection exception: java.net.ConnectException: Connection refused
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
at org.apache.hadoop.ipc.Client.call(Client.java:1071)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at org.apache.hadoop.mapred.$Proxy1.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
at org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:478)
at org.apache.hadoop.mapred.JobClient.init(JobClient.java:472)
at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:455)
at org.apache.hadoop.mapreduce.Job$1.run(Job.java:478)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:416)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:476)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:464)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:495)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:67)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:592)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:604)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1202)
at org.apache.hadoop.ipc.Client.call(Client.java:1046)
... 27 more

Oh no, it failed! Try running the JobTracker and wordcount
example manually ... it might be an issue with your environment that
this tutorial didn't cover (if you find this to be the case, please
create a JIRA for us and/or send us a code review).

./TUTORIAL.sh: line 662: kill: (1522) - No such process
make: *** [hadoop-0.20.205.0] Error 1


On 4/19/13 4:29 PM, "Benjamin Mahler" <be...@gmail.com>> wrote:

Brenden: It looks like Maven isn't required when building
hadoop-0.20.205.0, can you send a patch to fix your change to only check
for Maven when building the CDH releases?

Jim: Thanks for the report.

I committed a recent change by Brenden here, which enforces that both 'ant'
and 'mvn' are present when building the hadoop port:
https://reviews.apache.org/r/10558/


On Fri, Apr 19, 2013 at 3:51 PM, Jim Donahue <jd...@adobe.com>> wrote:

I was -- the last build I did was ten days ago.  Somebody broke the build
scripts that I've been using for quite a while.

Jim



On 4/19/13 3:48 PM, "Benjamin Mahler" <be...@gmail.com>> wrote:

>You can fix this by installing Maven.
>
>However, I was under the assumption that we required Maven in order to run
>the Hadoop tutorial. You were successfully building hadoop without Maven
>installed?
>
>
>On Fri, Apr 19, 2013 at 3:44 PM, Jim Donahue <jd...@adobe.com>> wrote:
>
>> I'm trying to build Mesos on Amazon Linux and it appears that the Hadoop
>> build script has changed.  It worked just fine a few days ago, but now
>>I'm
>> getting:
>>
>> sudo make hadoop-0.20.205.0
>> if test ".." != ".."; then \
>> cp -p ./TUTORIAL.sh .; \
>> cp -p ./hadoop-gridmix.patch .; \
>> cp -p ./hadoop-7698-1.patch .; \
>> cp -p ./hadoop-0.20.205.0_hadoop-env.sh.patch .; \
>> cp -p ./hadoop-0.20.205.0_mesos.patch .; \
>> cp -p ./mapred-site.xml.patch .; \
>> cp -rp ./mesos .; \
>> cp -p ./mesos-executor .; \
>> fi
>> rm -rf hadoop-0.20.205.0
>> which: no mvn in (/sbin:/bin:/usr/sbin:/usr/bin)
>>
>> We seem to be missing mvn from the path. Please install
>> mvn and re-run this tutorial. If you still have troubles, please report
>> this to:
>>
>> mesos-dev@incubator.apache.org<ma...@incubator.apache.org>
>>
>> (Remember to include as much debug information as possible.)
>>
>> Help, please!
>>
>>
>> Jim
>>




Re: Did something change in building hadoop-0.20.205.0???

Posted by Benjamin Mahler <be...@gmail.com>.
Brenden: It looks like Maven isn't required when building
hadoop-0.20.205.0, can you send a patch to fix your change to only check
for Maven when building the CDH releases?

Jim: Thanks for the report.

I committed a recent change by Brenden here, which enforces that both 'ant'
and 'mvn' are present when building the hadoop port:
https://reviews.apache.org/r/10558/


On Fri, Apr 19, 2013 at 3:51 PM, Jim Donahue <jd...@adobe.com> wrote:

> I was -- the last build I did was ten days ago.  Somebody broke the build
> scripts that I've been using for quite a while.
>
> Jim
>
>
>
> On 4/19/13 3:48 PM, "Benjamin Mahler" <be...@gmail.com> wrote:
>
> >You can fix this by installing Maven.
> >
> >However, I was under the assumption that we required Maven in order to run
> >the Hadoop tutorial. You were successfully building hadoop without Maven
> >installed?
> >
> >
> >On Fri, Apr 19, 2013 at 3:44 PM, Jim Donahue <jd...@adobe.com> wrote:
> >
> >> I'm trying to build Mesos on Amazon Linux and it appears that the Hadoop
> >> build script has changed.  It worked just fine a few days ago, but now
> >>I'm
> >> getting:
> >>
> >> sudo make hadoop-0.20.205.0
> >> if test ".." != ".."; then \
> >> cp -p ./TUTORIAL.sh .; \
> >> cp -p ./hadoop-gridmix.patch .; \
> >> cp -p ./hadoop-7698-1.patch .; \
> >> cp -p ./hadoop-0.20.205.0_hadoop-env.sh.patch .; \
> >> cp -p ./hadoop-0.20.205.0_mesos.patch .; \
> >> cp -p ./mapred-site.xml.patch .; \
> >> cp -rp ./mesos .; \
> >> cp -p ./mesos-executor .; \
> >> fi
> >> rm -rf hadoop-0.20.205.0
> >> which: no mvn in (/sbin:/bin:/usr/sbin:/usr/bin)
> >>
> >> We seem to be missing mvn from the path. Please install
> >> mvn and re-run this tutorial. If you still have troubles, please report
> >> this to:
> >>
> >> mesos-dev@incubator.apache.org
> >>
> >> (Remember to include as much debug information as possible.)
> >>
> >> Help, please!
> >>
> >>
> >> Jim
> >>
>
>

Re: Did something change in building hadoop-0.20.205.0???

Posted by Jim Donahue <jd...@adobe.com>.
I was -- the last build I did was ten days ago.  Somebody broke the build
scripts that I've been using for quite a while.

Jim



On 4/19/13 3:48 PM, "Benjamin Mahler" <be...@gmail.com> wrote:

>You can fix this by installing Maven.
>
>However, I was under the assumption that we required Maven in order to run
>the Hadoop tutorial. You were successfully building hadoop without Maven
>installed?
>
>
>On Fri, Apr 19, 2013 at 3:44 PM, Jim Donahue <jd...@adobe.com> wrote:
>
>> I'm trying to build Mesos on Amazon Linux and it appears that the Hadoop
>> build script has changed.  It worked just fine a few days ago, but now
>>I'm
>> getting:
>>
>> sudo make hadoop-0.20.205.0
>> if test ".." != ".."; then \
>> cp -p ./TUTORIAL.sh .; \
>> cp -p ./hadoop-gridmix.patch .; \
>> cp -p ./hadoop-7698-1.patch .; \
>> cp -p ./hadoop-0.20.205.0_hadoop-env.sh.patch .; \
>> cp -p ./hadoop-0.20.205.0_mesos.patch .; \
>> cp -p ./mapred-site.xml.patch .; \
>> cp -rp ./mesos .; \
>> cp -p ./mesos-executor .; \
>> fi
>> rm -rf hadoop-0.20.205.0
>> which: no mvn in (/sbin:/bin:/usr/sbin:/usr/bin)
>>
>> We seem to be missing mvn from the path. Please install
>> mvn and re-run this tutorial. If you still have troubles, please report
>> this to:
>>
>> mesos-dev@incubator.apache.org
>>
>> (Remember to include as much debug information as possible.)
>>
>> Help, please!
>>
>>
>> Jim
>>


Re: Did something change in building hadoop-0.20.205.0???

Posted by Benjamin Mahler <be...@gmail.com>.
You can fix this by installing Maven.

However, I was under the assumption that we required Maven in order to run
the Hadoop tutorial. You were successfully building hadoop without Maven
installed?


On Fri, Apr 19, 2013 at 3:44 PM, Jim Donahue <jd...@adobe.com> wrote:

> I'm trying to build Mesos on Amazon Linux and it appears that the Hadoop
> build script has changed.  It worked just fine a few days ago, but now I'm
> getting:
>
> sudo make hadoop-0.20.205.0
> if test ".." != ".."; then \
> cp -p ./TUTORIAL.sh .; \
> cp -p ./hadoop-gridmix.patch .; \
> cp -p ./hadoop-7698-1.patch .; \
> cp -p ./hadoop-0.20.205.0_hadoop-env.sh.patch .; \
> cp -p ./hadoop-0.20.205.0_mesos.patch .; \
> cp -p ./mapred-site.xml.patch .; \
> cp -rp ./mesos .; \
> cp -p ./mesos-executor .; \
> fi
> rm -rf hadoop-0.20.205.0
> which: no mvn in (/sbin:/bin:/usr/sbin:/usr/bin)
>
> We seem to be missing mvn from the path. Please install
> mvn and re-run this tutorial. If you still have troubles, please report
> this to:
>
> mesos-dev@incubator.apache.org
>
> (Remember to include as much debug information as possible.)
>
> Help, please!
>
>
> Jim
>

Did something change in building hadoop-0.20.205.0???

Posted by Jim Donahue <jd...@adobe.com>.
I'm trying to build Mesos on Amazon Linux and it appears that the Hadoop build script has changed.  It worked just fine a few days ago, but now I'm getting:

sudo make hadoop-0.20.205.0
if test ".." != ".."; then \
cp -p ./TUTORIAL.sh .; \
cp -p ./hadoop-gridmix.patch .; \
cp -p ./hadoop-7698-1.patch .; \
cp -p ./hadoop-0.20.205.0_hadoop-env.sh.patch .; \
cp -p ./hadoop-0.20.205.0_mesos.patch .; \
cp -p ./mapred-site.xml.patch .; \
cp -rp ./mesos .; \
cp -p ./mesos-executor .; \
fi
rm -rf hadoop-0.20.205.0
which: no mvn in (/sbin:/bin:/usr/sbin:/usr/bin)

We seem to be missing mvn from the path. Please install
mvn and re-run this tutorial. If you still have troubles, please report
this to:

mesos-dev@incubator.apache.org

(Remember to include as much debug information as possible.)

Help, please!


Jim

Re: Review Request: Run Hadoop tutorial binaries from within build dir.

Posted by Ben Mahler <be...@gmail.com>.

> On April 17, 2013, 5:53 p.m., Ben Mahler wrote:
> > Can you update your 'Testing Done' do indicate whether or not the tutorial works locally?
> 
> Brenden Matthews wrote:
>     Yes it works locally once all of my patches are applied.

I see, will it work if committed in isolation? If not, what needs to be committed along with this change?


- Ben


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/10492/#review19337
-----------------------------------------------------------


On April 16, 2013, 10 p.m., Brenden Matthews wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/10492/
> -----------------------------------------------------------
> 
> (Updated April 16, 2013, 10 p.m.)
> 
> 
> Review request for mesos.
> 
> 
> Description
> -------
> 
> Run Hadoop tutorial binaries from within build dir.
> 
> 
> Diffs
> -----
> 
>   hadoop/TUTORIAL.sh f8131cd 
> 
> Diff: https://reviews.apache.org/r/10492/diff/
> 
> 
> Testing
> -------
> 
> Used in production at Airbnb.
> 
> 
> Thanks,
> 
> Brenden Matthews
> 
>


Re: Review Request: Run Hadoop tutorial binaries from within build dir.

Posted by Brenden Matthews <br...@diddyinc.com>.

> On April 17, 2013, 5:53 p.m., Ben Mahler wrote:
> > Can you update your 'Testing Done' do indicate whether or not the tutorial works locally?
> 
> Brenden Matthews wrote:
>     Yes it works locally once all of my patches are applied.
> 
> Ben Mahler wrote:
>     I see, will it work if committed in isolation? If not, what needs to be committed along with this change?

For me, the last required one is https://reviews.apache.org/r/10563/


- Brenden


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/10492/#review19337
-----------------------------------------------------------




Re: Review Request: Run Hadoop tutorial binaries from within build dir.

Posted by Brenden Matthews <br...@diddyinc.com>.

> On April 17, 2013, 5:53 p.m., Ben Mahler wrote:
> > Can you update your 'Testing Done' to indicate whether or not the tutorial works locally?

Yes it works locally once all of my patches are applied.


- Brenden


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/10492/#review19337
-----------------------------------------------------------



Re: Review Request: Run Hadoop tutorial binaries from within build dir.

Posted by Ben Mahler <be...@gmail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/10492/#review19337
-----------------------------------------------------------


Can you update your 'Testing Done' to indicate whether or not the tutorial works locally?

- Ben Mahler



Re: Review Request: Run Hadoop tutorial binaries from within build dir.

Posted by Vinod Kone <vi...@gmail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/10492/#review19287
-----------------------------------------------------------

Ship it!


Ship It!

- Vinod Kone



Re: Review Request: Run Hadoop tutorial binaries from within build dir.

Posted by Brenden Matthews <br...@diddyinc.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/10492/
-----------------------------------------------------------

(Updated April 16, 2013, 10 p.m.)


Review request for mesos.


Changes
-------

I can confirm that this now works.


Description
-------

Run Hadoop tutorial binaries from within build dir.


Diffs (updated)
-----

  hadoop/TUTORIAL.sh f8131cd 

Diff: https://reviews.apache.org/r/10492/diff/


Testing
-------

Used in production at Airbnb.


Thanks,

Brenden Matthews


Re: Review Request: Run Hadoop tutorial binaries from within build dir.

Posted by Vinod Kone <vi...@twitter.com>.
I can't open this diff?

The file 'f8131cd' could not be found in the repository


Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/ReviewBoard-1.6.12-py2.7.egg/reviewboard/diffviewer/views.py", line 254, in view_diff
    populate_diff_chunks(temp_files, highlighting)
  File "/usr/local/lib/python2.7/dist-packages/ReviewBoard-1.6.12-py2.7.egg/reviewboard/diffviewer/diffutils.py", line 1182, in populate_diff_chunks
    large_data=True)
  File "/usr/local/lib/python2.7/dist-packages/Djblets-0.6.23-py2.7.egg/djblets/util/misc.py", line 156, in cache_memoize
    data = lookup_callable()
  File "/usr/local/lib/python2.7/dist-packages/ReviewBoard-1.6.12-py2.7.egg/reviewboard/diffviewer/diffutils.py", line 1181, in <lambda>
    enable_syntax_highlighting)),
  File "/usr/local/lib/python2.7/dist-packages/ReviewBoard-1.6.12-py2.7.egg/reviewboard/diffviewer/diffutils.py", line 588, in get_chunks
    old = get_original_file(filediff)
  File "/usr/local/lib/python2.7/dist-packages/ReviewBoard-1.6.12-py2.7.egg/reviewboard/diffviewer/diffutils.py", line 368, in get_original_file
    large_data=True)[0]
  File "/usr/local/lib/python2.7/dist-packages/Djblets-0.6.23-py2.7.egg/djblets/util/misc.py", line 156, in cache_memoize
    data = lookup_callable()
  File "/usr/local/lib/python2.7/dist-packages/ReviewBoard-1.6.12-py2.7.egg/reviewboard/diffviewer/diffutils.py", line 367, in <lambda>
    data = cache_memoize(key, lambda: [fetch_file(file, revision)],
  File "/usr/local/lib/python2.7/dist-packages/ReviewBoard-1.6.12-py2.7.egg/reviewboard/diffviewer/diffutils.py", line 347, in fetch_file
    data = repository.get_file(file, revision)
  File "/usr/local/lib/python2.7/dist-packages/ReviewBoard-1.6.12-py2.7.egg/reviewboard/scmtools/models.py", line 155, in get_file
    return self.get_scmtool().get_file(path, revision)
  File "/usr/local/lib/python2.7/dist-packages/ReviewBoard-1.6.12-py2.7.egg/reviewboard/scmtools/git.py", line 75, in get_file
    return self.client.get_file(path, revision)
  File "/usr/local/lib/python2.7/dist-packages/ReviewBoard-1.6.12-py2.7.egg/reviewboard/scmtools/git.py", line 383, in get_file
    return self._cat_file(path, revision, "blob")
  File "/usr/local/lib/python2.7/dist-packages/ReviewBoard-1.6.12-py2.7.egg/reviewboard/scmtools/git.py", line 438, in _cat_file
    raise FileNotFoundError(commit)
FileNotFoundError: The file 'f8131cd' could not be found in the repository
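For context, this FileNotFoundError comes from Review Board's git backend shelling out to `git cat-file` for the base revision named in the diff ('f8131cd'), which the server's clone did not contain. A minimal sketch of that check, using a throwaway repository since the real blob only exists in a full mesos clone (file name and contents here are invented for illustration):

```shell
# Hypothetical repro of the lookup Review Board performs: it asks git
# whether an object with the given (abbreviated) hash exists, and raises
# FileNotFoundError when git reports that it does not.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
printf 'placeholder\n' > TUTORIAL.sh
git add TUTORIAL.sh
git -c user.name=demo -c user.email=demo@example.com commit -qm 'initial'
blob=$(git rev-parse HEAD:TUTORIAL.sh)

# A blob that exists in the repository passes the existence check...
git cat-file -e "$blob" && echo "found $blob"

# ...while the abbreviated hash from the review ('f8131cd') is unknown to
# this throwaway repository, just as it was to the Review Board server.
if git cat-file -e f8131cd 2>/dev/null; then
  echo "present f8131cd"
else
  echo "missing f8131cd"
fi
```

In practice an error like this usually means the server's mirror had not yet fetched the commit the diff was generated against, so re-checking once the mirror catches up (or re-posting the diff against a commit the server has) resolves it.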



@vinodkone

