Posted to dev@mesos.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2013/05/22 16:38:07 UTC

Build failed in Jenkins: Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0 #14

See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/14/changes>

Changes:

[vinod] Fixed LICENSE and NOTICE.

[vinod] Updated CHANGELOG for 0.11.0.

[vinod] Fixed CHANGELOG to include 0.11.0 release notes.

------------------------------------------
[...truncated 2659 lines...]
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT local/mesos_local-main.o -MD -MP -MF local/.deps/mesos_local-main.Tpo -c -o local/mesos_local-main.o `test -f 'local/main.cpp' || echo '../../src/'`local/main.cpp
mv -f local/.deps/mesos_local-main.Tpo local/.deps/mesos_local-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-local local/mesos_local-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-local local/mesos_local-main.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT log/mesos_log-main.o -MD -MP -MF log/.deps/mesos_log-main.Tpo -c -o log/mesos_log-main.o `test -f 'log/main.cpp' || echo '../../src/'`log/main.cpp
mv -f log/.deps/mesos_log-main.Tpo log/.deps/mesos_log-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-log log/mesos_log-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-log log/mesos_log-main.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT mesos/mesos_mesos-main.o -MD -MP -MF mesos/.deps/mesos_mesos-main.Tpo -c -o mesos/mesos_mesos-main.o `test -f 'mesos/main.cpp' || echo '../../src/'`mesos/main.cpp
mv -f mesos/.deps/mesos_mesos-main.Tpo mesos/.deps/mesos_mesos-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-mesos mesos/mesos_mesos-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-mesos mesos/mesos_mesos-main.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT launcher/mesos_launcher-main.o -MD -MP -MF launcher/.deps/mesos_launcher-main.Tpo -c -o launcher/mesos_launcher-main.o `test -f 'launcher/main.cpp' || echo '../../src/'`launcher/main.cpp
mv -f launcher/.deps/mesos_launcher-main.Tpo launcher/.deps/mesos_launcher-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-launcher launcher/mesos_launcher-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-launcher launcher/mesos_launcher-main.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT launcher/mesos_executor-executor.o -MD -MP -MF launcher/.deps/mesos_executor-executor.Tpo -c -o launcher/mesos_executor-executor.o `test -f 'launcher/executor.cpp' || echo '../../src/'`launcher/executor.cpp
mv -f launcher/.deps/mesos_executor-executor.Tpo launcher/.deps/mesos_executor-executor.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-executor launcher/mesos_executor-executor.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-executor launcher/mesos_executor-executor.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT master/mesos_master-main.o -MD -MP -MF master/.deps/mesos_master-main.Tpo -c -o master/mesos_master-main.o `test -f 'master/main.cpp' || echo '../../src/'`master/main.cpp
mv -f master/.deps/mesos_master-main.Tpo master/.deps/mesos_master-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-master master/mesos_master-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-master master/mesos_master-main.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT slave/mesos_slave-main.o -MD -MP -MF slave/.deps/mesos_slave-main.Tpo -c -o slave/mesos_slave-main.o `test -f 'slave/main.cpp' || echo '../../src/'`slave/main.cpp
mv -f slave/.deps/mesos_slave-main.Tpo slave/.deps/mesos_slave-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-slave slave/mesos_slave-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-slave slave/mesos_slave-main.o  ./.libs/libmesos.so -lrt -pthread
Building protobuf Python egg ...
Generating google/protobuf/descriptor_pb2.py...
Generating google/protobuf/compiler/plugin_pb2.py...
running bdist_egg
running egg_info
creating protobuf.egg-info
writing protobuf.egg-info/PKG-INFO
writing namespace_packages to protobuf.egg-info/namespace_packages.txt
writing top-level names to protobuf.egg-info/top_level.txt
writing dependency_links to protobuf.egg-info/dependency_links.txt
writing manifest file 'protobuf.egg-info/SOURCES.txt'
package init file 'google/protobuf/compiler/__init__.py' not found (or not a regular file)
reading manifest file 'protobuf.egg-info/SOURCES.txt'
writing manifest file 'protobuf.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build
creating build/lib.linux-x86_64-2.7
creating build/lib.linux-x86_64-2.7/google
creating build/lib.linux-x86_64-2.7/google/protobuf
creating build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/__init__.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/api_implementation.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/containers.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/cpp_message.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/decoder.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/encoder.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/message_listener.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/python_message.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/type_checkers.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/wire_format.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/__init__.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/descriptor.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/descriptor_pb2.py -> build/lib.linux-x86_64-2.7/google/protobuf
creating build/lib.linux-x86_64-2.7/google/protobuf/compiler
copying google/protobuf/compiler/plugin_pb2.py -> build/lib.linux-x86_64-2.7/google/protobuf/compiler
copying google/protobuf/message.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/reflection.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/service.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/service_reflection.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/text_format.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/__init__.py -> build/lib.linux-x86_64-2.7/google
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/google
creating build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/descriptor_pb2.py -> build/bdist.linux-x86_64/egg/google/protobuf
creating build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/containers.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/encoder.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/decoder.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/api_implementation.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/type_checkers.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/cpp_message.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/wire_format.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/message_listener.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/__init__.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/python_message.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/descriptor.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/reflection.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/message.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/text_format.py -> build/bdist.linux-x86_64/egg/google/protobuf
creating build/bdist.linux-x86_64/egg/google/protobuf/compiler
copying build/lib.linux-x86_64-2.7/google/protobuf/compiler/plugin_pb2.py -> build/bdist.linux-x86_64/egg/google/protobuf/compiler
copying build/lib.linux-x86_64-2.7/google/protobuf/service.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/service_reflection.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/__init__.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/__init__.py -> build/bdist.linux-x86_64/egg/google
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/descriptor_pb2.py to descriptor_pb2.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/containers.py to containers.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/encoder.py to encoder.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/decoder.py to decoder.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/api_implementation.py to api_implementation.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/type_checkers.py to type_checkers.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/cpp_message.py to cpp_message.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/wire_format.py to wire_format.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/message_listener.py to message_listener.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/python_message.py to python_message.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/descriptor.py to descriptor.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/reflection.py to reflection.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/message.py to message.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/text_format.py to text_format.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/compiler/plugin_pb2.py to plugin_pb2.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/service.py to service.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/service_reflection.py to service_reflection.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/__init__.py to __init__.pyc
Creating missing __init__.py for google.protobuf.compiler
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/compiler/__init__.py to __init__.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying protobuf.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying protobuf.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying protobuf.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying protobuf.egg-info/namespace_packages.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying protobuf.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
zip_safe flag not set; analyzing archive contents...
creating dist
creating 'dist/protobuf-2.4.1-py2.7.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
Building Mesos Python egg ...
WARNING: '.' not a valid package name; please use only.-separated package names in setup.py
running bdist_egg
running egg_info
creating src/mesos.egg-info
writing src/mesos.egg-info/PKG-INFO
writing top-level names to src/mesos.egg-info/top_level.txt
writing dependency_links to src/mesos.egg-info/dependency_links.txt
writing manifest file 'src/mesos.egg-info/SOURCES.txt'
package init file 'src/__init__.py' not found (or not a regular file)
reading manifest file 'src/mesos.egg-info/SOURCES.txt'
writing manifest file 'src/mesos.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build
creating build/lib.linux-x86_64-2.7
copying src/mesos.py -> build/lib.linux-x86_64-2.7
copying src/mesos_pb2.py -> build/lib.linux-x86_64-2.7
running build_ext
building '_mesos' extension
creating build/temp.linux-x86_64-2.7
creating build/temp.linux-x86_64-2.7/native
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/../include -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/include -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src/python/native -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/third_party/protobuf-2.4.1/src -I/usr/include/python2.7 -c native/proxy_executor.cpp -o build/temp.linux-x86_64-2.7/native/proxy_executor.o
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]
In file included from /usr/include/python2.7/Python.h:8:0,
                 from native/proxy_executor.hpp:22,
                 from native/proxy_executor.cpp:21:
/usr/include/python2.7/pyconfig.h:1161:0: warning: "_POSIX_C_SOURCE" redefined [enabled by default]
/usr/include/features.h:164:0: note: this is the location of the previous definition
/usr/include/python2.7/pyconfig.h:1183:0: warning: "_XOPEN_SOURCE" redefined [enabled by default]
/usr/include/features.h:166:0: note: this is the location of the previous definition
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/../include -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/include -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src/python/native -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/third_party/protobuf-2.4.1/src -I/usr/include/python2.7 -c native/mesos_executor_driver_impl.cpp -o build/temp.linux-x86_64-2.7/native/mesos_executor_driver_impl.o
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/../include -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/include -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src/python/native -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/third_party/protobuf-2.4.1/src -I/usr/include/python2.7 -c native/module.cpp -o build/temp.linux-x86_64-2.7/native/module.o
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/../include -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/include -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src/python/native -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/third_party/protobuf-2.4.1/src -I/usr/include/python2.7 -c native/proxy_scheduler.cpp -o build/temp.linux-x86_64-2.7/native/proxy_scheduler.o
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]
In file included from /usr/include/python2.7/Python.h:8:0,
                 from native/proxy_scheduler.hpp:22,
                 from native/proxy_scheduler.cpp:21:
/usr/include/python2.7/pyconfig.h:1161:0: warning: "_POSIX_C_SOURCE" redefined [enabled by default]
/usr/include/features.h:164:0: note: this is the location of the previous definition
/usr/include/python2.7/pyconfig.h:1183:0: warning: "_XOPEN_SOURCE" redefined [enabled by default]
/usr/include/features.h:166:0: note: this is the location of the previous definition
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/../include -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/include -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src/python/native -I/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/third_party/protobuf-2.4.1/src -I/usr/include/python2.7 -c native/mesos_scheduler_driver_impl.cpp -o build/temp.linux-x86_64-2.7/native/mesos_scheduler_driver_impl.o
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]
g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro build/temp.linux-x86_64-2.7/native/proxy_executor.o build/temp.linux-x86_64-2.7/native/mesos_executor_driver_impl.o build/temp.linux-x86_64-2.7/native/module.o build/temp.linux-x86_64-2.7/native/proxy_scheduler.o build/temp.linux-x86_64-2.7/native/mesos_scheduler_driver_impl.o /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src/.libs/libmesos_no_third_party.a /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/third_party/protobuf-2.4.1/src/.libs/libprotobuf.a /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/third_party/glog-0.3.1/.libs/libglog.a /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/third_party/leveldb/libleveldb.a /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/third_party/zookeeper-3.3.4/src/c/.libs/libzookeeper_mt.a /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/third_party/libprocess/.libs/libprocess.a /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/third_party/libprocess/third_party/libev-3.8/.libs/libev.a -o build/lib.linux-x86_64-2.7/_mesos.so -lrt
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/egg
copying build/lib.linux-x86_64-2.7/mesos.py -> build/bdist.linux-x86_64/egg
copying build/lib.linux-x86_64-2.7/_mesos.so -> build/bdist.linux-x86_64/egg
copying build/lib.linux-x86_64-2.7/mesos_pb2.py -> build/bdist.linux-x86_64/egg
byte-compiling build/bdist.linux-x86_64/egg/mesos.py to mesos.pyc
byte-compiling build/bdist.linux-x86_64/egg/mesos_pb2.py to mesos_pb2.pyc
creating stub loader for _mesos.so
byte-compiling build/bdist.linux-x86_64/egg/_mesos.py to _mesos.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying src/mesos.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying src/mesos.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying src/mesos.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying src/mesos.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
writing build/bdist.linux-x86_64/egg/EGG-INFO/native_libs.txt
zip_safe flag not set; analyzing archive contents...
creating dist
creating 'dist/mesos-0.11.0-py2.7-linux-x86_64.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
make[2]: Leaving directory `/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src'
make[1]: Leaving directory `/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src'
Making all in ec2
make[1]: Entering directory `/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/ec2'
make[1]: Nothing to be done for `all'.
make[1]: Leaving directory `/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/ec2'
Making all in hadoop
make[1]: Entering directory `/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop'
make[1]: Nothing to be done for `all'.
make[1]: Leaving directory `/x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop'
+ cd hadoop
+ GLOG_v=1
+ MESOS_VERBOSE=1
+ make hadoop-0.20.205.0
if test "../.." != ".."; then \
          cp -p ../../hadoop/TUTORIAL.sh .; \
          cp -p ../../hadoop/hadoop-0.20.205.0.patch .; \
          cp -p ../../hadoop/hadoop-0.20.205.0_hadoop-env.sh.patch .; \
          cp -p ../../hadoop/hadoop-0.20.205.0_mesos.patch .; \
          cp -p ../../hadoop/mapred-site.xml.patch .; \
          cp -rp ../../hadoop/mesos .; \
          cp -p ../../hadoop/mesos-executor .; \
        fi
rm -rf hadoop-0.20.205.0

Welcome to the tutorial on running Apache Hadoop on top of Mesos!
During this interactive guide we'll ask some yes/no
questions and you should enter your answer via 'Y' or 'y' for yes and
'N' or 'n' for no.

Let's begin!


We'll try and grab hadoop-0.20.205.0 for you now via:

  $ wget http://apache.cs.utah.edu/hadoop/common/hadoop-0.20.205.0/hadoop-0.20.205.0.tar.gz


--2013-05-22 14:38:07--  http://apache.cs.utah.edu/hadoop/common/hadoop-0.20.205.0/hadoop-0.20.205.0.tar.gz
Resolving apache.cs.utah.edu (apache.cs.utah.edu)... 155.98.64.87
Connecting to apache.cs.utah.edu (apache.cs.utah.edu)|155.98.64.87|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2013-05-22 14:38:07 ERROR 404: Not Found.


Oh no! We failed to run 'wget http://apache.cs.utah.edu/hadoop/common/hadoop-0.20.205.0/hadoop-0.20.205.0.tar.gz'. If you need help, try emailing:

  mesos-dev@incubator.apache.org

(Remember to include as much debug information as possible.)

make: *** [hadoop-0.20.205.0] Error 1
Build step 'Execute shell' marked build as failure
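
The 404 above is what actually failed build #14: TUTORIAL.sh hard-codes a single
mirror (apache.cs.utah.edu), and mirrors routinely drop old releases such as
hadoop-0.20.205.0. Below is a minimal sketch of a more forgiving fetch step,
assuming the tarball is still available from the long-term Apache archive
(archive.apache.org); the mirror list and retry loop are illustrative and are
not part of TUTORIAL.sh:

  #!/usr/bin/env bash
  # Try each candidate mirror in order and stop at the first successful download.
  TARBALL=hadoop-0.20.205.0.tar.gz
  MIRRORS=(
    "http://apache.cs.utah.edu/hadoop/common/hadoop-0.20.205.0"         # mirror from the log (404s)
    "https://archive.apache.org/dist/hadoop/common/hadoop-0.20.205.0"   # assumed long-term archive location
  )
  for base in "${MIRRORS[@]}"; do
    if wget -q "${base}/${TARBALL}"; then
      echo "Fetched ${TARBALL} from ${base}"
      break
    fi
  done
  # Fail loudly if no mirror worked.
  [ -f "${TARBALL}" ] || { echo "All mirrors failed for ${TARBALL}" >&2; exit 1; }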

Re: Build failed in Jenkins: Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0 #20

Posted by Vinod Kone <vi...@twitter.com>.
Disabled the Hadoop jobs on Jenkins because we still have an outstanding bug:
https://issues.apache.org/jira/browse/MESOS-480


@vinodkone


On Wed, Jun 19, 2013 at 9:15 AM, Apache Jenkins Server <jenkins@builds.apache.org> wrote:

> See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/20/>

Build failed in Jenkins: Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0 #20

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/20/>

------------------------------------------
[...truncated 7353 lines...]
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#streaming;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;4.5 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api;2.5-20081211 in default
[ivy:resolve] 	found asm#asm;3.2 in default
[ivy:resolve] 	found com.sun.jersey#jersey-core;1.8 in default
[ivy:resolve] 	found com.sun.jersey#jersey-json;1.8 in default
[ivy:resolve] 	found com.sun.jersey#jersey-server;1.8 in default
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.0.1 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.4 in default
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in default
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.0.1 in default
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in default
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in default
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in default
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in default
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 129ms :: artifacts dl 7ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   25  |   0   |   0   |   3   ||   22  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#streaming [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 22 already retrieved (0kB/5ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: streaming
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/streaming/hadoop-streaming-0.20.205.0.jar>

compile-examples:

jar-examples:

package:
    [mkdir] Created dir: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming>

check-contrib:

init:
     [echo] contrib: thriftfs

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#thriftfs;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in default
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in default
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in default
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in default
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 52ms :: artifacts dl 3ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   12  |   0   |   0   |   3   ||   9   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 9 already retrieved (0kB/3ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: thriftfs
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/thriftfs/hadoop-thriftfs-0.20.205.0.jar>

compile-examples:

jar-examples:

package:
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/lib>

init:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#vaidya;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 10ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 2 already retrieved (0kB/1ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: vaidya
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
     [echo] contrib: vaidya
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/vaidya/hadoop-vaidya-0.20.205.0.jar>

package:
    [mkdir] Created dir: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya>
     [copy] Copying 3 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya>
     [copy] Copying 35 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps>
     [copy] Copied 13 empty directories to 2 empty directories under <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps>
     [copy] Copying 5 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/bin>
     [copy] Copying 16 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/libexec>
     [copy] Copying 16 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/etc/hadoop>
     [copy] Copying 4 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/doc/hadoop>
     [copy] Copying 7 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin>

BUILD SUCCESSFUL
Total time: 1 minute 4 seconds

To build the Mesos executor package, we first copy the
necessary Mesos libraries.


  $ cd build/hadoop-0.20.205.0
  $ mkdir -p lib/native/Linux-amd64-64
  $ cp <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/.libs/libmesos.so> lib/native/Linux-amd64-64



  Finally, we will build the Mesos executor package as follows:


  $ cd ..
  $ mv hadoop-0.20.205.0 hadoop-0.20.205.0-mesos
  $ tar czf hadoop-0.20.205.0-mesos.tar.gz hadoop-0.20.205.0-mesos/
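
  Below is a condensed sketch of the packaging steps above with basic error
  checking. It is only an illustration, not part of TUTORIAL.sh; the libmesos.so
  path is taken as an argument because the workspace layout differs between the
  builds in this log.

  # Hypothetical helper, not part of TUTORIAL.sh.
  # Usage: package-executor.sh /path/to/libmesos.so
  set -euo pipefail

  LIBMESOS="$1"
  DIST=hadoop-0.20.205.0

  # Fail early if the library was never built.
  [ -f "$LIBMESOS" ] || { echo "libmesos.so not found: $LIBMESOS" >&2; exit 1; }

  cd "build/$DIST"
  mkdir -p lib/native/Linux-amd64-64
  cp "$LIBMESOS" lib/native/Linux-amd64-64/

  cd ..
  mv "$DIST" "$DIST-mesos"
  tar czf "$DIST-mesos.tar.gz" "$DIST-mesos/"

  # Sanity check: the native library really made it into the tarball.
  tar tzf "$DIST-mesos.tar.gz" | grep -q 'lib/native/Linux-amd64-64/libmesos.so'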



Build success!

The Mesos distribution is now built in 'hadoop-0.20.205.0-mesos'

Now let's run something!

We'll try to start the JobTracker from the Mesos distribution path via:
  $ cd hadoop-0.20.205.0-mesos
  $ ./bin/hadoop jobtracker



JobTracker started at 4661.

Waiting 5 seconds for it to start. . . . . .
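
  The fixed five-second sleep gives no signal about whether the JobTracker
  actually survived startup. A hedged alternative sketch follows; it is not what
  TUTORIAL.sh does, it assumes curl is available, and port 50030 (the stock
  Hadoop 0.20 JobTracker web UI port) is an assumption about this configuration.

  cd hadoop-0.20.205.0-mesos
  ./bin/hadoop jobtracker > jobtracker.out 2>&1 &
  JT_PID=$!
  echo "JobTracker started at ${JT_PID}."

  for i in $(seq 1 30); do
    # Give up immediately if the process already died; otherwise poll the web UI.
    kill -0 "$JT_PID" 2>/dev/null || { echo "JobTracker exited early; see jobtracker.out" >&2; exit 1; }
    curl -sf "http://localhost:50030/" > /dev/null && { echo "JobTracker is up."; break; }
    sleep 1
  done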
Alright, now let's run the "wordcount" example via:

  $ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount   <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/mesos> out


Exception in thread "main" java.io.IOException: Error opening job jar: hadoop-examples-0.20.205.0.jar
	at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
	at java.util.zip.ZipFile.open(Native Method)
	at java.util.zip.ZipFile.<init>(ZipFile.java:114)
	at java.util.jar.JarFile.<init>(JarFile.java:135)
	at java.util.jar.JarFile.<init>(JarFile.java:72)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
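
  The ZipException above is what Java reports when the jar it was asked to open
  is empty, truncated, or not a valid zip archive, and the command resolves
  hadoop-examples-0.20.205.0.jar relative to the current directory. A quick
  pre-flight check, assuming the shell is sitting in hadoop-0.20.205.0-mesos:

  $ ls -l hadoop-examples-0.20.205.0.jar      # present and non-empty in the current directory?
  $ find . -name 'hadoop-examples-*.jar'      # or did it land somewhere else in the distribution?
  $ unzip -t hadoop-examples-0.20.205.0.jar   # integrity check; a truncated jar fails here too
  $ jar tf hadoop-examples-0.20.205.0.jar | grep -m1 WordCount   # does it contain the example classes?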

Oh no, it failed! Try running the JobTracker and wordcount
example manually ... it might be an issue with your environment that
this tutorial didn't cover (if you find this to be the case, please
create a JIRA for us and/or send us a code review).

./TUTORIAL.sh: line 704: kill: (4661) - No such process
make: *** [hadoop-0.20.205.0] Error 1
Build step 'Execute shell' marked build as failure
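
  The "kill: (4661) - No such process" line above means the JobTracker PID
  recorded by the script was already gone by cleanup time, i.e. the JobTracker
  exited on its own at some point after the "started at 4661" line. TUTORIAL.sh
  line 704 is not shown in this log, so the following guard is only an
  illustration, not the actual script:

  # Only kill the background JobTracker if it is still alive; report otherwise.
  if [ -n "${JT_PID:-}" ] && kill -0 "$JT_PID" 2>/dev/null; then
    kill "$JT_PID"
  else
    echo "JobTracker (PID ${JT_PID:-unknown}) already exited before cleanup" >&2
  fi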

Build failed in Jenkins: Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0 #19

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/19/>

------------------------------------------
[...truncated 7320 lines...]
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#streaming;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;4.5 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve] 	found asm#asm;3.2 in maven2
[ivy:resolve] 	found com.sun.jersey#jersey-core;1.8 in maven2
[ivy:resolve] 	found com.sun.jersey#jersey-json;1.8 in maven2
[ivy:resolve] 	found com.sun.jersey#jersey-server;1.8 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.0.1 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.4 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.0.1 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 110ms :: artifacts dl 8ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   25  |   0   |   0   |   3   ||   22  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#streaming [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 22 already retrieved (0kB/4ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: streaming
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/streaming/hadoop-streaming-0.20.205.0.jar>

compile-examples:

jar-examples:

package:
    [mkdir] Created dir: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming>

check-contrib:

init:
     [echo] contrib: thriftfs

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#thriftfs;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 53ms :: artifacts dl 3ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   12  |   0   |   0   |   3   ||   9   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 9 already retrieved (0kB/2ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: thriftfs
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/thriftfs/hadoop-thriftfs-0.20.205.0.jar>

compile-examples:

jar-examples:

package:
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/lib>

init:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#vaidya;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 11ms :: artifacts dl 0ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 2 already retrieved (0kB/2ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: vaidya
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
     [echo] contrib: vaidya
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/vaidya/hadoop-vaidya-0.20.205.0.jar>

package:
    [mkdir] Created dir: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya>
     [copy] Copying 3 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya>
     [copy] Copying 35 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps>
     [copy] Copied 13 empty directories to 2 empty directories under <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps>
     [copy] Copying 5 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/bin>
     [copy] Copying 16 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/libexec>
     [copy] Copying 16 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/etc/hadoop>
     [copy] Copying 4 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/doc/hadoop>
     [copy] Copying 7 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin>

BUILD SUCCESSFUL
Total time: 1 minute 2 seconds

To build the Mesos executor package, we first copy the
necessary Mesos libraries.


  $ cd build/hadoop-0.20.205.0
  $ mkdir -p lib/native/Linux-amd64-64
  $ cp <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/.libs/libmesos.so> lib/native/Linux-amd64-64



  Finally, we will build the Mesos executor package as follows:


  $ cd ..
  $ mv hadoop-0.20.205.0 hadoop-0.20.205.0-mesos
  $ tar czf hadoop-0.20.205.0-mesos.tar.gz hadoop-0.20.205.0-mesos/



Build success!

The Mesos distribution is now built in 'hadoop-0.20.205.0-mesos'

Now let's run something!

We'll try to start the JobTracker from the Mesos distribution path via:
  $ cd hadoop-0.20.205.0-mesos
  $ ./bin/hadoop jobtracker



JobTracker started at 23827.

Waiting 5 seconds for it to start. . . . . .
Alright, now let's run the "wordcount" example via:

  $ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount   <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/mesos> out
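
  For reference, this invocation resolves both the examples jar and the 'out'
  output directory relative to the current directory, and uses the Mesos source
  tree shown in the command as the wordcount input. A hedged pre-flight sketch
  follows; the INPUT argument stands for that directory, and removing a stale
  'out' directory assumes a local-mode run, where Hadoop refuses to write into
  an existing output directory.

  INPUT="$1"                              # the .../build/src/mesos directory from the command above
  ls -l hadoop-examples-0.20.205.0.jar    # the jar is resolved relative to the current directory
  ls -d "$INPUT"                          # the wordcount input directory must exist
  rm -rf out                              # a leftover output directory would also make the job fail
  ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount "$INPUT" out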


Exception in thread "main" java.io.IOException: Error opening job jar: hadoop-examples-0.20.205.0.jar
	at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
	at java.util.zip.ZipFile.open(Native Method)
	at java.util.zip.ZipFile.<init>(ZipFile.java:114)
	at java.util.jar.JarFile.<init>(JarFile.java:135)
	at java.util.jar.JarFile.<init>(JarFile.java:72)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:88)

Oh no, it failed! Try running the JobTracker and wordcount
example manually ... it might be an issue with your environment that
this tutorial didn't cover (if you find this to be the case, please
create a JIRA for us and/or send us a code review).

./TUTORIAL.sh: line 704: kill: (23827) - No such process
make: *** [hadoop-0.20.205.0] Error 1
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0 #18

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/18/>

------------------------------------------
[...truncated 7342 lines...]
      [get] To: /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#streaming;working@hemera
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;4.5 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api;2.5-20081211 in default
[ivy:resolve] 	found asm#asm;3.2 in default
[ivy:resolve] 	found com.sun.jersey#jersey-core;1.8 in default
[ivy:resolve] 	found com.sun.jersey#jersey-json;1.8 in default
[ivy:resolve] 	found com.sun.jersey#jersey-server;1.8 in default
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.0.1 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.4 in default
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in default
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.0.1 in default
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in default
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in maven2
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in default
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in default
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in default
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 87ms :: artifacts dl 6ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   25  |   0   |   0   |   3   ||   22  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#streaming [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 22 already retrieved (0kB/3ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml

compile:
     [echo] contrib: streaming
    [javac] /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
      [jar] Building jar: /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/contrib/streaming/hadoop-streaming-0.20.205.0.jar

compile-examples:

jar-examples:

package:
    [mkdir] Created dir: /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming
     [copy] Copying 1 file to /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming

check-contrib:

init:
     [echo] contrib: thriftfs

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#thriftfs;working@hemera
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in default
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in maven2
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in default
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in default
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in default
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 39ms :: artifacts dl 2ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   12  |   0   |   0   |   3   ||   9   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 9 already retrieved (0kB/2ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml

compile:
     [echo] contrib: thriftfs
    [javac] /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
      [jar] Building jar: /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/contrib/thriftfs/hadoop-thriftfs-0.20.205.0.jar

compile-examples:

jar-examples:

package:
     [copy] Copying 1 file to /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/lib

init:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#vaidya;working@hemera
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 9ms :: artifacts dl 0ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 2 already retrieved (0kB/1ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml

compile:
     [echo] contrib: vaidya
    [javac] /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
     [echo] contrib: vaidya
      [jar] Building jar: /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/contrib/vaidya/hadoop-vaidya-0.20.205.0.jar

package:
    [mkdir] Created dir: /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya
     [copy] Copying 3 files to /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya
     [copy] Copying 35 files to /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps
     [copy] Copied 13 empty directories to 2 empty directories under /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps
     [copy] Copying 5 files to /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop
     [copy] Copying 1 file to /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/bin
     [copy] Copying 16 files to /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin
     [copy] Copying 1 file to /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/libexec
     [copy] Copying 16 files to /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/etc/hadoop
     [copy] Copying 4 files to /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/doc/hadoop
     [copy] Copying 7 files to /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin

BUILD SUCCESSFUL
Total time: 47 seconds

To build the Mesos executor package, we first copy the
necessary Mesos libraries.


  $ cd build/hadoop-0.20.205.0
  $ mkdir -p lib/native/Linux-amd64-64
  $ cp /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src/.libs/libmesos.so lib/native/Linux-amd64-64



  Finally, we will build the Mesos executor package as follows:


  $ cd ..
  $ mv hadoop-0.20.205.0 hadoop-0.20.205.0-mesos
  $ tar czf hadoop-0.20.205.0-mesos.tar.gz hadoop-0.20.205.0-mesos/
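
  Each failed build in this log later dies opening the examples jar, and the
  truncated Ant output above does not show whether that jar was rebuilt and
  repacked. A hedged sanity check on the tarball built above (file names taken
  from this log) would confirm whether the pieces the tutorial needs are present:

  $ tar tzf hadoop-0.20.205.0-mesos.tar.gz | grep 'libmesos\.so$'
  $ tar tzf hadoop-0.20.205.0-mesos.tar.gz | grep 'hadoop-examples-0\.20\.205\.0\.jar$' \
      || echo 'examples jar is missing from the distribution' >&2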



Build success!

The Mesos distribution is now built in 'hadoop-0.20.205.0-mesos'

Now let's run something!

We'll try to start the JobTracker from the Mesos distribution path via:
  $ cd hadoop-0.20.205.0-mesos
  $ ./bin/hadoop jobtracker



JobTracker started at 7375.

Waiting 5 seconds for it to start. . . . . .
Alright, now let's run the "wordcount" example via:

  $ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount   /x1/jenkins/jenkins-slave/workspace/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/build/src/mesos out


Exception in thread "main" java.io.IOException: Error opening job jar: hadoop-examples-0.20.205.0.jar
	at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
	at java.util.zip.ZipFile.open(Native Method)
	at java.util.zip.ZipFile.<init>(ZipFile.java:114)
	at java.util.jar.JarFile.<init>(JarFile.java:135)
	at java.util.jar.JarFile.<init>(JarFile.java:72)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:88)

Oh no, it failed! Try running the JobTracker and wordcount
example manually ... it might be an issue with your environment that
this tutorial didn't cover (if you find this to be the case, please
create a JIRA for us and/or send us a code review).

./TUTORIAL.sh: line 704: kill: (7375) - No such process
make: *** [hadoop-0.20.205.0] Error 1
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0 #17

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/17/>

------------------------------------------
[...truncated 7331 lines...]
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#streaming;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;4.5 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve] 	found asm#asm;3.2 in maven2
[ivy:resolve] 	found com.sun.jersey#jersey-core;1.8 in maven2
[ivy:resolve] 	found com.sun.jersey#jersey-json;1.8 in maven2
[ivy:resolve] 	found com.sun.jersey#jersey-server;1.8 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.0.1 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.4 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.0.1 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 134ms :: artifacts dl 8ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   25  |   0   |   0   |   3   ||   22  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#streaming [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 22 already retrieved (0kB/4ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: streaming
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/streaming/hadoop-streaming-0.20.205.0.jar>

compile-examples:

jar-examples:

package:
    [mkdir] Created dir: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming>

check-contrib:

init:
     [echo] contrib: thriftfs

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#thriftfs;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 55ms :: artifacts dl 3ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   12  |   0   |   0   |   3   ||   9   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 9 already retrieved (0kB/3ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: thriftfs
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/thriftfs/hadoop-thriftfs-0.20.205.0.jar>

compile-examples:

jar-examples:

package:
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/lib>

init:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#vaidya;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 12ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 2 already retrieved (0kB/1ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: vaidya
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
     [echo] contrib: vaidya
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/vaidya/hadoop-vaidya-0.20.205.0.jar>

package:
    [mkdir] Created dir: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya>
     [copy] Copying 3 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya>
     [copy] Copying 35 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps>
     [copy] Copied 13 empty directories to 2 empty directories under <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps>
     [copy] Copying 5 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/bin>
     [copy] Copying 16 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/libexec>
     [copy] Copying 16 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/etc/hadoop>
     [copy] Copying 4 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/doc/hadoop>
     [copy] Copying 7 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin>

BUILD SUCCESSFUL
Total time: 1 minute 7 seconds

To build the Mesos executor package, we first copy the
necessary Mesos libraries.


  $ cd build/hadoop-0.20.205.0
  $ mkdir -p lib/native/Linux-amd64-64
  $ cp <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/.libs/libmesos.so> lib/native/Linux-amd64-64



  Finally, we will build the Mesos executor package as follows:


  $ cd ..
  $ mv hadoop-0.20.205.0 hadoop-0.20.205.0-mesos
  $ tar czf hadoop-0.20.205.0-mesos.tar.gz hadoop-0.20.205.0-mesos/



Build success!

The Mesos distribution is now built in 'hadoop-0.20.205.0-mesos'

Now let's run something!

We'll try to start the JobTracker from the Mesos distribution path via:
  $ cd hadoop-0.20.205.0-mesos
  $ ./bin/hadoop jobtracker



JobTracker started at 29827.

Waiting 5 seconds for it to start. . . . . .
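
  By the cleanup step at the end of this log the JobTracker PID no longer
  exists, so the fixed wait hides the real question: did the JobTracker survive
  startup at all? A hedged liveness check right after the wait (the PID is the
  one printed above):

  $ kill -0 29827 2>/dev/null && echo 'JobTracker still running' || echo 'JobTracker already exited' >&2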
Alright, now let's run the "wordcount" example via:

  $ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount   <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/mesos> out


Exception in thread "main" java.io.IOException: Error opening job jar: hadoop-examples-0.20.205.0.jar
	at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
	at java.util.zip.ZipFile.open(Native Method)
	at java.util.zip.ZipFile.<init>(ZipFile.java:114)
	at java.util.jar.JarFile.<init>(JarFile.java:135)
	at java.util.jar.JarFile.<init>(JarFile.java:72)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:88)

Oh no, it failed! Try running the JobTracker and wordcount
example manually ... it might be an issue with your environment that
this tutorial didn't cover (if you find this to be the case, please
create a JIRA for us and/or send us a code review).

./TUTORIAL.sh: line 704: kill: (29827) - No such process
make: *** [hadoop-0.20.205.0] Error 1
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0 #16

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/16/changes>

Changes:

[vinod] Fixed scheduler driver to call disconnected() when master fails over.

[vinod] Fixed master to send a FrameworkReregistered message when the

[vinod] Fixed release script to use git instead of svn.

[vinod] Fixed a bug in the release script regarding tag creation.

[benh] Duration related refactoring changes.

[benh] Time related refactoring changes.

[benh] Duration-Time related refactoring changes.

[benh] Performed GTEST_IS_THREADSAFE check.

[benh] Removed unnecessary use of Option.

[benh] Refactored zookeeper_tests.cpp into master_detector_tests.cpp and

[benh] Made tests flags inherit from logging.

[benh] Moved 'tests::mkdtemp' to Environment.

[benh] Minor cleanup in stout/exit.hpp.

[benh] Improved library for using JVM/JNI and updated uses (in tests).

[benh] Removed unused variables in AllocatorZooKeeperTest.

[benh] Renamed tests/zookeeper_test.hpp|cpp to zookeeper.hpp|cpp.

[benh] Refactored MesosTest/MesosClusterTest into a generic fixture for

[benh] Cleaned up the output from running and automagically disabling the

[benh] Used Milliseconds rather than Duration::parse.

[benh] Removed unnecessary TestingIsolators.

[benh] A little spring cleaning in the allocator tests.

[benh] More spring cleaning, this time in the slave recovery tests.

[benh] Replaced local::launch in tests with MesosTest.

[benh] Removed unused local::launch overload.

[benh] Cleanups in configure.ac.

[benh] Refactored base 'State' implementation to be serialization agnostic

[benh] Added a 'port' field to SlaveInfo and updated default master and slave

[benh] Fixed output bug with CHECK_SOME.

[benh] Added some helpers for failing a collection of futures.

[benh] Fixed synchronization bug when waiting for a process.

[benh] Fixed bug where we didn't stop all MesosExecutorDrivers when using the

[benh] Updated MonitorTest.WatchUnwatch to be deterministic.

[benh] Removed a using directive that causes compilation to fail when

[benh] Only build group tests with Java.

[vinod] Added DISCLAIMER to the distribution.

[vinod] Fixed NOTICE and LICENSE.

[benh] Fix for bug using 'TRUE' and 'FALSE' as identifiers on OS X.

[benh] Moved flags to stout.

[benh] Replaced flags and configurator in Mesos with flags in stout.

[benh] Added a 'Logging' process to libprocess.

[benh] Removed logging process from Mesos (now in libprocess).

[benh] Updated libprocess to use '3rdparty' instead of 'third_party'.

[benh] Renamed 'third_party' to '3rdparty'.

[benh] Added stout specific 'CHECK' constructs.

[benh] Replaced Mesos CHECK_SOME with stout CHECK_SOME.

[benh] Added 'ThreadLocal' to stout/thread.hpp.

[benh] Used ThreadLocal from stout.

[benh] Fixes os::environ for OS X.

[vinod] Updated version to 0.14.0.

[benh] Add Slave and Framework struct to HierarchicalAllocatorProcess. Cleans

[benh] Added a retry option to cgroups::mount in order to deal with a bug

[vinod] Fixed Zookeeper to recursively create parent paths as necessary.

[vinod] Exposed version in "/vars" and "/state.json" endpoints.

[vinod] Fixed slave to not send tasks and executor info of a terminated executor,

[bmahler] Updated the NOTICE to include the correct year, and to fix line

[vinod] Fixed slave to properly handle terminated tasks that have pending

[vinod] Fixed master to properly do task reconciliation when slave re-registers.

[vinod] Added a new 'statistics.json' endpoint to the ResourceMonitor, this

[bmahler] Updated 3rd_party licences in the LICENCE file.

[bmahler] Updated the CHANGELOG for 0.12.0.

[bmahler] Added a master detector document.

[bmahler] Fixed the name of the master detector document.

[bmahler] Added an Upgrade document.

[bmahler] Updated the CHANGELOG with additional tickets fixed in 0.12.0.

[bmahler] Fixed a typo in the Master Detection filename.

[vinod] Updated CHANGELOG for 0.13.0.

[bmahler] Updated release tag format in the release script to use the new

[brenden.matthews] Run Hadoop tutorial binaries from within build dir.

[brenden.matthews] Build fix for HadoopPipes.cc with GCC 4.7.

[brenden.matthews] Hadoop tutorial version bump (CDH4.2.0 -> 4.2.1).

[vinod] Fixed slave to properly handle duplicate terminal updates for the

[vinod] Updated CHANGELOG for 0.13.0 (rc2).

[brenden.matthews] WebUI: Use slave hostname rather than libprocess.

[bmahler] Fixed libprocess tests to write to stderr when using newer versions

[bmahler] Fixed mesos tests to write to stderr when using newer versions of

[bmahler] Fixed a bug in the Slave's logging.

[woggle] Update deploy scripts.

[benh] Added os::sysctl to stout.

------------------------------------------
[...truncated 7348 lines...]
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#streaming;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;4.5 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve] 	found asm#asm;3.2 in maven2
[ivy:resolve] 	found com.sun.jersey#jersey-core;1.8 in maven2
[ivy:resolve] 	found com.sun.jersey#jersey-json;1.8 in maven2
[ivy:resolve] 	found com.sun.jersey#jersey-server;1.8 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.0.1 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.4 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.0.1 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 113ms :: artifacts dl 7ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   25  |   0   |   0   |   3   ||   22  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#streaming [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 22 already retrieved (0kB/4ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: streaming
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/streaming/hadoop-streaming-0.20.205.0.jar>

compile-examples:

jar-examples:

package:
    [mkdir] Created dir: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming>

check-contrib:

init:
     [echo] contrib: thriftfs

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#thriftfs;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 54ms :: artifacts dl 3ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   12  |   0   |   0   |   3   ||   9   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 9 already retrieved (0kB/3ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: thriftfs
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/thriftfs/hadoop-thriftfs-0.20.205.0.jar>

compile-examples:

jar-examples:

package:
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/lib>

init:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#vaidya;working@janus
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 14ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 2 already retrieved (0kB/1ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>

compile:
     [echo] contrib: vaidya
    [javac] <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
     [echo] contrib: vaidya
      [jar] Building jar: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/vaidya/hadoop-vaidya-0.20.205.0.jar>

package:
    [mkdir] Created dir: <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya>
     [copy] Copying 3 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya>
     [copy] Copying 35 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps>
     [copy] Copied 13 empty directories to 2 empty directories under <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps>
     [copy] Copying 5 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/bin>
     [copy] Copying 16 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin>
     [copy] Copying 1 file to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/libexec>
     [copy] Copying 16 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/etc/hadoop>
     [copy] Copying 4 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/doc/hadoop>
     [copy] Copying 7 files to <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin>

BUILD SUCCESSFUL
Total time: 1 minute 1 second

To build the Mesos executor package, we first copy the
necessary Mesos libraries.


  $ cd build/hadoop-0.20.205.0
  $ mkdir -p lib/native/Linux-amd64-64
  $ cp <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/.libs/libmesos.so> lib/native/Linux-amd64-64
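A quick sanity check of that copy can't hurt (a hedged sketch; the path below simply mirrors the commands above and is not part of TUTORIAL.sh):

  $ ls -l lib/native/Linux-amd64-64/libmesos.so   # confirm the library landed in the native lib directory
  $ file lib/native/Linux-amd64-64/libmesos.so    # expect a 64-bit ELF shared object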



  Finally, we will build the Mesos executor package as follows:


  $ cd ..
  $ mv hadoop-0.20.205.0 hadoop-0.20.205.0-mesos
  $ tar czf hadoop-0.20.205.0-mesos.tar.gz hadoop-0.20.205.0-mesos/
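Before using the package it is worth confirming that the Mesos library actually made it into the archive (a hedged sketch, reusing only the file names from the commands above):

  $ tar tzf hadoop-0.20.205.0-mesos.tar.gz | grep libmesos.so
  # expect a path ending in lib/native/Linux-amd64-64/libmesos.so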



Build success!

The Mesos distribution is now built in 'hadoop-0.20.205.0-mesos'

Now let's run something!

We'll try to start the JobTracker from the Mesos distribution path via:
  $ cd hadoop-0.20.205.0-mesos
  $ ./bin/hadoop jobtracker



JobTracker started at 4901.

Waiting 5 seconds for it to start. . . . . .
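For anyone reproducing this step by hand outside the tutorial, a rough equivalent is sketched below; the log file name and the JT_PID variable are illustrative and are not taken from TUTORIAL.sh:

  $ ./bin/hadoop jobtracker > jobtracker.log 2>&1 &   # run the JobTracker in the background
  $ JT_PID=$!                                         # remember its PID so it can be stopped later
  $ sleep 5                                           # give it a few seconds to come up
  $ kill $JT_PID 2>/dev/null                          # stop it when done (it may already have exited)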
Alright, now let's run the "wordcount" example via:

  $ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount   <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/mesos> out


Exception in thread "main" java.io.IOException: Error opening job jar: hadoop-examples-0.20.205.0.jar
	at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
	at java.util.zip.ZipFile.open(Native Method)
	at java.util.zip.ZipFile.<init>(ZipFile.java:114)
	at java.util.jar.JarFile.<init>(JarFile.java:135)
	at java.util.jar.JarFile.<init>(JarFile.java:72)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:88)

Oh no, it failed! Try running the JobTracker and the wordcount
example manually; the failure may be caused by something in your
environment that this tutorial doesn't cover (if you find that to be
the case, please create a JIRA for us and/or send us a code review).
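A minimal manual check along those lines is sketched below; the commands are a suggestion and are not part of TUTORIAL.sh. The java.util.zip.ZipException above indicates that hadoop-examples-0.20.205.0.jar exists but is not a readable zip archive (for example, zero-length or truncated), so inspecting the jar itself is a good first step:

  $ ls -l hadoop-examples-0.20.205.0.jar   # run from inside hadoop-0.20.205.0-mesos; a zero-length file would explain the error
  $ jar tf hadoop-examples-0.20.205.0.jar > /dev/null && echo "examples jar looks intact" || echo "examples jar is corrupt or unreadable"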

./TUTORIAL.sh: line 704: kill: (4901) - No such process
make: *** [hadoop-0.20.205.0] Error 1
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0 #15

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/15/>

------------------------------------------
[...truncated 2423 lines...]
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT local/mesos_local-main.o -MD -MP -MF local/.deps/mesos_local-main.Tpo -c -o local/mesos_local-main.o `test -f 'local/main.cpp' || echo '../../src/'`local/main.cpp
mv -f local/.deps/mesos_local-main.Tpo local/.deps/mesos_local-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-local local/mesos_local-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-local local/mesos_local-main.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT log/mesos_log-main.o -MD -MP -MF log/.deps/mesos_log-main.Tpo -c -o log/mesos_log-main.o `test -f 'log/main.cpp' || echo '../../src/'`log/main.cpp
mv -f log/.deps/mesos_log-main.Tpo log/.deps/mesos_log-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-log log/mesos_log-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-log log/mesos_log-main.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT mesos/mesos_mesos-main.o -MD -MP -MF mesos/.deps/mesos_mesos-main.Tpo -c -o mesos/mesos_mesos-main.o `test -f 'mesos/main.cpp' || echo '../../src/'`mesos/main.cpp
mv -f mesos/.deps/mesos_mesos-main.Tpo mesos/.deps/mesos_mesos-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-mesos mesos/mesos_mesos-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-mesos mesos/mesos_mesos-main.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT launcher/mesos_launcher-main.o -MD -MP -MF launcher/.deps/mesos_launcher-main.Tpo -c -o launcher/mesos_launcher-main.o `test -f 'launcher/main.cpp' || echo '../../src/'`launcher/main.cpp
mv -f launcher/.deps/mesos_launcher-main.Tpo launcher/.deps/mesos_launcher-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-launcher launcher/mesos_launcher-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-launcher launcher/mesos_launcher-main.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT launcher/mesos_executor-executor.o -MD -MP -MF launcher/.deps/mesos_executor-executor.Tpo -c -o launcher/mesos_executor-executor.o `test -f 'launcher/executor.cpp' || echo '../../src/'`launcher/executor.cpp
mv -f launcher/.deps/mesos_executor-executor.Tpo launcher/.deps/mesos_executor-executor.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-executor launcher/mesos_executor-executor.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-executor launcher/mesos_executor-executor.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT master/mesos_master-main.o -MD -MP -MF master/.deps/mesos_master-main.Tpo -c -o master/mesos_master-main.o `test -f 'master/main.cpp' || echo '../../src/'`master/main.cpp
mv -f master/.deps/mesos_master-main.Tpo master/.deps/mesos_master-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-master master/mesos_master-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-master master/mesos_master-main.o  ./.libs/libmesos.so -lrt -pthread
g++ -DPACKAGE_NAME=\"mesos\" -DPACKAGE_TARNAME=\"mesos\" -DPACKAGE_VERSION=\"0.11.0\" -DPACKAGE_STRING=\"mesos\ 0.11.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"mesos\" -DVERSION=\"0.11.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_PTHREAD=1 -DMESOS_HAS_JAVA=1 -DMESOS_HAS_PYTHON=1 -I. -I../../src   -Wall -Werror -DMESOS_WEBUI_DIR=\"/usr/local/share/mesos/webui\" -DMESOS_LIBEXECDIR=\"/usr/local/libexec/mesos\" -I../../include -I../../third_party/libprocess/include -I../include -I../third_party/boost-1.51.0 -I../third_party/protobuf-2.4.1/src -I../third_party/glog-0.3.1/src -I../third_party/zookeeper-3.3.4/src/c/include -I../third_party/zookeeper-3.3.4/src/c/generated  -pthread -g2 -O2 -MT slave/mesos_slave-main.o -MD -MP -MF slave/.deps/mesos_slave-main.Tpo -c -o slave/mesos_slave-main.o `test -f 'slave/main.cpp' || echo '../../src/'`slave/main.cpp
mv -f slave/.deps/mesos_slave-main.Tpo slave/.deps/mesos_slave-main.Po
/bin/bash ../libtool --tag=CXX   --mode=link g++ -pthread -g2 -O2   -o mesos-slave slave/mesos_slave-main.o libmesos.la -lrt
libtool: link: g++ -pthread -g2 -O2 -o .libs/mesos-slave slave/mesos_slave-main.o  ./.libs/libmesos.so -lrt -pthread
Building protobuf Python egg ...
Generating google/protobuf/descriptor_pb2.py...
Generating google/protobuf/compiler/plugin_pb2.py...
running bdist_egg
running egg_info
creating protobuf.egg-info
writing protobuf.egg-info/PKG-INFO
writing namespace_packages to protobuf.egg-info/namespace_packages.txt
writing top-level names to protobuf.egg-info/top_level.txt
writing dependency_links to protobuf.egg-info/dependency_links.txt
writing manifest file 'protobuf.egg-info/SOURCES.txt'
package init file 'google/protobuf/compiler/__init__.py' not found (or not a regular file)
reading manifest file 'protobuf.egg-info/SOURCES.txt'
writing manifest file 'protobuf.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build
creating build/lib.linux-x86_64-2.7
creating build/lib.linux-x86_64-2.7/google
creating build/lib.linux-x86_64-2.7/google/protobuf
creating build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/__init__.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/api_implementation.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/containers.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/cpp_message.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/decoder.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/encoder.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/message_listener.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/python_message.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/type_checkers.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/internal/wire_format.py -> build/lib.linux-x86_64-2.7/google/protobuf/internal
copying google/protobuf/__init__.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/descriptor.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/descriptor_pb2.py -> build/lib.linux-x86_64-2.7/google/protobuf
creating build/lib.linux-x86_64-2.7/google/protobuf/compiler
copying google/protobuf/compiler/plugin_pb2.py -> build/lib.linux-x86_64-2.7/google/protobuf/compiler
copying google/protobuf/message.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/reflection.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/service.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/service_reflection.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/protobuf/text_format.py -> build/lib.linux-x86_64-2.7/google/protobuf
copying google/__init__.py -> build/lib.linux-x86_64-2.7/google
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/google
creating build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/text_format.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/message.py -> build/bdist.linux-x86_64/egg/google/protobuf
creating build/bdist.linux-x86_64/egg/google/protobuf/compiler
copying build/lib.linux-x86_64-2.7/google/protobuf/compiler/plugin_pb2.py -> build/bdist.linux-x86_64/egg/google/protobuf/compiler
copying build/lib.linux-x86_64-2.7/google/protobuf/service_reflection.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/__init__.py -> build/bdist.linux-x86_64/egg/google/protobuf
creating build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/encoder.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/decoder.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/type_checkers.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/api_implementation.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/containers.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/wire_format.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/message_listener.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/__init__.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/cpp_message.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/internal/python_message.py -> build/bdist.linux-x86_64/egg/google/protobuf/internal
copying build/lib.linux-x86_64-2.7/google/protobuf/descriptor_pb2.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/descriptor.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/service.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/protobuf/reflection.py -> build/bdist.linux-x86_64/egg/google/protobuf
copying build/lib.linux-x86_64-2.7/google/__init__.py -> build/bdist.linux-x86_64/egg/google
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/text_format.py to text_format.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/message.py to message.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/compiler/plugin_pb2.py to plugin_pb2.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/service_reflection.py to service_reflection.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/encoder.py to encoder.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/decoder.py to decoder.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/type_checkers.py to type_checkers.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/api_implementation.py to api_implementation.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/containers.py to containers.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/wire_format.py to wire_format.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/message_listener.py to message_listener.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/cpp_message.py to cpp_message.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/internal/python_message.py to python_message.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/descriptor_pb2.py to descriptor_pb2.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/descriptor.py to descriptor.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/service.py to service.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/reflection.py to reflection.pyc
byte-compiling build/bdist.linux-x86_64/egg/google/__init__.py to __init__.pyc
Creating missing __init__.py for google.protobuf.compiler
byte-compiling build/bdist.linux-x86_64/egg/google/protobuf/compiler/__init__.py to __init__.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying protobuf.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying protobuf.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying protobuf.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying protobuf.egg-info/namespace_packages.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying protobuf.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
zip_safe flag not set; analyzing archive contents...
creating dist
creating 'dist/protobuf-2.4.1-py2.7.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
Building Mesos Python egg ...
WARNING: '.' not a valid package name; please use only.-separated package names in setup.py
running bdist_egg
running egg_info
creating src/mesos.egg-info
writing src/mesos.egg-info/PKG-INFO
writing top-level names to src/mesos.egg-info/top_level.txt
writing dependency_links to src/mesos.egg-info/dependency_links.txt
writing manifest file 'src/mesos.egg-info/SOURCES.txt'
package init file 'src/__init__.py' not found (or not a regular file)
reading manifest file 'src/mesos.egg-info/SOURCES.txt'
writing manifest file 'src/mesos.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build
creating build/lib.linux-x86_64-2.7
copying src/mesos_pb2.py -> build/lib.linux-x86_64-2.7
copying src/mesos.py -> build/lib.linux-x86_64-2.7
running build_ext
building '_mesos' extension
creating build/temp.linux-x86_64-2.7
creating build/temp.linux-x86_64-2.7/native
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/../include> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/include> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/python/native> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/third_party/protobuf-2.4.1/src> -I/usr/include/python2.7 -c native/proxy_executor.cpp -o build/temp.linux-x86_64-2.7/native/proxy_executor.o
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]
In file included from /usr/include/python2.7/Python.h:8:0,
                 from native/proxy_executor.hpp:22,
                 from native/proxy_executor.cpp:21:
/usr/include/python2.7/pyconfig.h:1161:0: warning: "_POSIX_C_SOURCE" redefined [enabled by default]
/usr/include/features.h:164:0: note: this is the location of the previous definition
/usr/include/python2.7/pyconfig.h:1183:0: warning: "_XOPEN_SOURCE" redefined [enabled by default]
/usr/include/features.h:166:0: note: this is the location of the previous definition
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/../include> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/include> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/python/native> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/third_party/protobuf-2.4.1/src> -I/usr/include/python2.7 -c native/mesos_scheduler_driver_impl.cpp -o build/temp.linux-x86_64-2.7/native/mesos_scheduler_driver_impl.o
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/../include> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/include> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/python/native> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/third_party/protobuf-2.4.1/src> -I/usr/include/python2.7 -c native/module.cpp -o build/temp.linux-x86_64-2.7/native/module.o
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/../include> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/include> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/python/native> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/third_party/protobuf-2.4.1/src> -I/usr/include/python2.7 -c native/proxy_scheduler.cpp -o build/temp.linux-x86_64-2.7/native/proxy_scheduler.o
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]
In file included from /usr/include/python2.7/Python.h:8:0,
                 from native/proxy_scheduler.hpp:22,
                 from native/proxy_scheduler.cpp:21:
/usr/include/python2.7/pyconfig.h:1161:0: warning: "_POSIX_C_SOURCE" redefined [enabled by default]
/usr/include/features.h:164:0: note: this is the location of the previous definition
/usr/include/python2.7/pyconfig.h:1183:0: warning: "_XOPEN_SOURCE" redefined [enabled by default]
/usr/include/features.h:166:0: note: this is the location of the previous definition
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/../include> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/include> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/python/native> -I<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/third_party/protobuf-2.4.1/src> -I/usr/include/python2.7 -c native/mesos_executor_driver_impl.cpp -o build/temp.linux-x86_64-2.7/native/mesos_executor_driver_impl.o
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]
g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro build/temp.linux-x86_64-2.7/native/proxy_executor.o build/temp.linux-x86_64-2.7/native/mesos_scheduler_driver_impl.o build/temp.linux-x86_64-2.7/native/module.o build/temp.linux-x86_64-2.7/native/proxy_scheduler.o build/temp.linux-x86_64-2.7/native/mesos_executor_driver_impl.o <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/.libs/libmesos_no_third_party.a> <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/third_party/protobuf-2.4.1/src/.libs/libprotobuf.a> <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/third_party/glog-0.3.1/.libs/libglog.a> <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/third_party/leveldb/libleveldb.a> <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/third_party/zookeeper-3.3.4/src/c/.libs/libzookeeper_mt.a> <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/third_party/libprocess/.libs/libprocess.a> <https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/third_party/libprocess/third_party/libev-3.8/.libs/libev.a> -o build/lib.linux-x86_64-2.7/_mesos.so -lrt
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/egg
copying build/lib.linux-x86_64-2.7/mesos_pb2.py -> build/bdist.linux-x86_64/egg
copying build/lib.linux-x86_64-2.7/mesos.py -> build/bdist.linux-x86_64/egg
copying build/lib.linux-x86_64-2.7/_mesos.so -> build/bdist.linux-x86_64/egg
byte-compiling build/bdist.linux-x86_64/egg/mesos_pb2.py to mesos_pb2.pyc
byte-compiling build/bdist.linux-x86_64/egg/mesos.py to mesos.pyc
creating stub loader for _mesos.so
byte-compiling build/bdist.linux-x86_64/egg/_mesos.py to _mesos.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying src/mesos.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying src/mesos.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying src/mesos.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying src/mesos.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
writing build/bdist.linux-x86_64/egg/EGG-INFO/native_libs.txt
zip_safe flag not set; analyzing archive contents...
creating dist
creating 'dist/mesos-0.11.0-py2.7-linux-x86_64.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
make[2]: Leaving directory `<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src'>
make[1]: Leaving directory `<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src'>
Making all in ec2
make[1]: Entering directory `<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/ec2'>
make[1]: Nothing to be done for `all'.
make[1]: Leaving directory `<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/ec2'>
Making all in hadoop
make[1]: Entering directory `<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop'>
make[1]: Nothing to be done for `all'.
make[1]: Leaving directory `<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop'>
+ cd hadoop
+ GLOG_v=1
+ MESOS_VERBOSE=1
+ make hadoop-0.20.205.0
if test "../.." != ".."; then \
          cp -p ../../hadoop/TUTORIAL.sh .; \
          cp -p ../../hadoop/hadoop-0.20.205.0.patch .; \
          cp -p ../../hadoop/hadoop-0.20.205.0_hadoop-env.sh.patch .; \
          cp -p ../../hadoop/hadoop-0.20.205.0_mesos.patch .; \
          cp -p ../../hadoop/mapred-site.xml.patch .; \
          cp -rp ../../hadoop/mesos .; \
          cp -p ../../hadoop/mesos-executor .; \
        fi
rm -rf hadoop-0.20.205.0

Welcome to the tutorial on running Apache Hadoop on top of Mesos!
During this interactive guide we'll ask some yes/no questions;
answer with 'Y' or 'y' for yes and 'N' or 'n' for no.

Let's begin!


We'll try to grab hadoop-0.20.205.0 for you now via:

  $ wget http://apache.cs.utah.edu/hadoop/common/hadoop-0.20.205.0/hadoop-0.20.205.0.tar.gz


--2013-06-18 21:09:11--  http://apache.cs.utah.edu/hadoop/common/hadoop-0.20.205.0/hadoop-0.20.205.0.tar.gz
Resolving apache.cs.utah.edu (apache.cs.utah.edu)... 155.98.64.87
Connecting to apache.cs.utah.edu (apache.cs.utah.edu)|155.98.64.87|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2013-06-18 21:09:11 ERROR 404: Not Found.


Oh no! We failed to run 'wget http://apache.cs.utah.edu/hadoop/common/hadoop-0.20.205.0/hadoop-0.20.205.0.tar.gz'. If you need help, try emailing:

  mesos-dev@incubator.apache.org

(Remember to include as much debug information as possible.)

make: *** [hadoop-0.20.205.0] Error 1
Build step 'Execute shell' marked build as failure
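The 404 above is a mirror problem rather than a build problem: apache.cs.utah.edu does not serve that path (HTTP 404). A hedged workaround, assuming the release is still kept on the Apache archive and that TUTORIAL.sh will pick up an already-downloaded tarball in the hadoop build directory (neither assumption is verified from this log), would be:

  $ cd build/hadoop                     # the directory the 'make hadoop-0.20.205.0' step runs in
  $ wget https://archive.apache.org/dist/hadoop/common/hadoop-0.20.205.0/hadoop-0.20.205.0.tar.gz
  $ make hadoop-0.20.205.0              # re-run the tutorial with the tarball already present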