Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/12/09 18:00:45 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #6780

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6780/display/redirect>

------------------------------------------
[...truncated 146.60 KB...]
Task ':beam-sdks-python:hdfsIntegrationTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./apache_beam/io/hdfs_integration_test/hdfs_integration_test.sh
Successfully started process 'command 'sh''
++ dirname ./apache_beam/io/hdfs_integration_test/hdfs_integration_test.sh
+ TEST_DIR=./apache_beam/io/hdfs_integration_test
+ ROOT_DIR=./apache_beam/io/hdfs_integration_test/../../../../..
+ CONTEXT_DIR=./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
+ rm -r ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
rm: cannot remove './apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration': No such file or directory
+ true
+ mkdir -p ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/sdks
+ cp ./apache_beam/io/hdfs_integration_test/docker-compose.yml ./apache_beam/io/hdfs_integration_test/Dockerfile ./apache_beam/io/hdfs_integration_test/hdfscli.cfg ./apache_beam/io/hdfs_integration_test/hdfs_integration_test.sh ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../sdks/python ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/sdks/
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../model ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/
++ echo hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6780
+ PROJECT_NAME=hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6780
+ '[' -z jenkins-beam_PostCommit_Python_Verify-6780 ']'
+ COLOR_OPT=--no-ansi
+ COMPOSE_OPT='-p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6780 --no-ansi'
+ cd ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
+ docker network prune --force
runtime/cgo: pthread_create failed: Resource temporarily unavailable
SIGABRT: abort
PC=0x7f9b251a6428 m=3

goroutine 0 [idle]:

goroutine 1 [runnable, locked to thread]:
runtime.gcenable()
	/usr/local/go/src/runtime/mgc.go:193
runtime.main()
	/usr/local/go/src/runtime/proc.go:151 +0x12a
runtime.goexit()
	/usr/local/go/src/runtime/asm_amd64.s:2086 +0x1

goroutine 17 [syscall, locked to thread]:
runtime.goexit()
	/usr/local/go/src/runtime/asm_amd64.s:2086 +0x1

rax    0x0
rbx    0x7f9b25536700
rcx    0x7f9b251a6428
rdx    0x6
rdi    0x3435
rsi    0x342f
rbp    0xf1d11e
rsp    0x7f9b2476a9b8
r8     0x7f9b25537770
r9     0x7f9b2476b700
r10    0x8
r11    0x206
r12    0x7f9b1c0008c0
r13    0xf3
r14    0x30
r15    0x3
rip    0x7f9b251a6428
rflags 0x206
cs     0x33
fs     0x0
gs     0x0
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) completed. Took 0.073 secs.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:postCommitIT FAILED
Caching disabled for task ':beam-sdks-python:postCommitIT': Caching has not been enabled for the task
Task ':beam-sdks-python:postCommitIT' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 --process-timeout=4500 --attr=IT"
Successfully started process 'command 'sh''


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"


###########################################################################
# Run tests and validate that jobs finish successfully.

>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20
echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470: UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
Process SyncManager-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 558, in _run_server
    server.serve_forever()
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 184, in serve_forever
    t.start()
  File "/usr/lib/python2.7/threading.py", line 736, in start
    _start_new_thread(self.__bootstrap, ())
error: can't start new thread
interrupted
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 0.944 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 159

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 313

* What went wrong:
Execution failed for task ':beam-sdks-python:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 2

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 274

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 35s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/6n44jm3figdts

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
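
The failures in this build all trace back to "Resource temporarily unavailable" (EAGAIN) when the executor tries to start new threads or processes: docker's Go runtime aborts in pthread_create, OpenBLAS cannot start its BLAS worker threads, and Python's threading module reports "can't start new thread". OpenBLAS prints the relevant limit as RLIMIT_NPROC 10240. A minimal diagnostic sketch, assuming a Linux executor and a "jenkins" build user (both assumptions, not taken from the build scripts), that reads the same limit next to the current per-user process count:

# hypothetical standalone diagnostic, not part of the Beam repository
import resource
import subprocess

# the per-user process/thread limit that pthread_create and fork run into
soft, hard = resource.getrlimit(resource.RLIMIT_NPROC)
print("RLIMIT_NPROC soft=%d hard=%d" % (soft, hard))

# processes currently owned by the assumed build user
procs = subprocess.check_output(["ps", "--no-headers", "-u", "jenkins"])
print("processes owned by jenkins: %d" % len(procs.splitlines()))

When the second number approaches the first, every new thread or child process on the node fails with EAGAIN, which is consistent with all three task failures reported above.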

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python_Verify #6785

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6785/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python_Verify #6784

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6784/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-4150] Use unwindowed coder in FnApiRunner optimization phases.

[robertwb] [BEAM-6186] Optimization cleanup: move phase utilities out of local

------------------------------------------
[...truncated 150.99 KB...]
main.initializeDockerCli(0xc4203c9960, 0xc4200986c0, 0xc420361220)
	/usr/src/docker/.gopath/src/github.com/docker/docker/cmd/docker/docker.go:139 +0x76
main.setValidateArgs.func1.1(0xc42040f440, 0xc42045a2f0, 0x0, 0x1, 0x0, 0x0)
	/usr/src/docker/.gopath/src/github.com/docker/docker/cmd/docker/docker.go:125 +0x4e
github.com/docker/docker/vendor/github.com/spf13/cobra.(*Command).ValidateArgs(0xc42040f440, 0xc42045a2f0, 0x0, 0x1, 0x0, 0x0)
	/usr/src/docker/.gopath/src/github.com/docker/docker/vendor/github.com/spf13/cobra/command.go:771 +0x59
github.com/docker/docker/vendor/github.com/spf13/cobra.(*Command).execute(0xc42040f440, 0xc42000c330, 0x1, 0x1, 0xc42040f440, 0xc42000c330)
	/usr/src/docker/.gopath/src/github.com/docker/docker/vendor/github.com/spf13/cobra/command.go:622 +0x1c2
github.com/docker/docker/vendor/github.com/spf13/cobra.(*Command).ExecuteC(0xc4200a1d40, 0xc4203d8240, 0xc42037bf50, 0xc4200404e0)
	/usr/src/docker/.gopath/src/github.com/docker/docker/vendor/github.com/spf13/cobra/command.go:742 +0x377
github.com/docker/docker/vendor/github.com/spf13/cobra.(*Command).Execute(0xc4200a1d40, 0xc4200a1d40, 0x1517260)
	/usr/src/docker/.gopath/src/github.com/docker/docker/vendor/github.com/spf13/cobra/command.go:695 +0x2b
main.main()
	/usr/src/docker/.gopath/src/github.com/docker/docker/cmd/docker/docker.go:169 +0xcb

goroutine 17 [syscall, locked to thread]:
runtime.goexit()
	/usr/local/go/src/runtime/asm_amd64.s:2086 +0x1

goroutine 5 [syscall]:
os/signal.signal_recv(0x0)
	/usr/local/go/src/runtime/sigqueue.go:116 +0x157
os/signal.loop()
	/usr/local/go/src/os/signal/signal_unix.go:22 +0x22
created by os/signal.init.1
	/usr/local/go/src/os/signal/signal_unix.go:28 +0x41

goroutine 8 [syscall]:
net/http.(*Transport).tryPutIdleConn(0xc4200e4960, 0xc4202c7a00, 0x0, 0x0)
	/usr/local/go/src/net/http/transport.go:677 +0x79c
net/http.(*persistConn).readLoop.func2(0x0, 0xc4200e4b01)
	/usr/local/go/src/net/http/transport.go:1391 +0x40
net/http.(*persistConn).readLoop(0xc4202c7a00)
	/usr/local/go/src/net/http/transport.go:1548 +0x85d
created by net/http.(*Transport).dialConn
	/usr/local/go/src/net/http/transport.go:1062 +0x4e9

goroutine 9 [select]:
net/http.(*persistConn).writeLoop(0xc4202c7a00)
	/usr/local/go/src/net/http/transport.go:1646 +0x3bd
created by net/http.(*Transport).dialConn
	/usr/local/go/src/net/http/transport.go:1063 +0x50e

rax    0x0
rbx    0x7f61b8ee8700
rcx    0x7f61b8b58428
rdx    0x6
rdi    0x340b
rsi    0x2c04
rbp    0xf1d11e
rsp    0x7f61b711a988
r8     0x7f61b8ee9770
r9     0x7f61b711b700
r10    0x8
r11    0x202
r12    0x7f61a80008c0
r13    0xf3
r14    0x30
r15    0x3
rip    0x7f61b8b58428
rflags 0x202
cs     0x33
fs     0x0
gs     0x0
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) completed. Took 0.088 secs.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:postCommitIT FAILED
Caching disabled for task ':beam-sdks-python:postCommitIT': Caching has not been enabled for the task
Task ':beam-sdks-python:postCommitIT' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 --process-timeout=4500 --attr=IT"
Successfully started process 'command 'sh''


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"


###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470: UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
OpenBLAS blas_thread_init: pthread_create: Resource temporarily unavailable
OpenBLAS blas_thread_init: RLIMIT_NPROC 10240 current, 10240 max
Process SyncManager-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 558, in _run_server
    server.serve_forever()
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 184, in serve_forever
    t.start()
  File "/usr/lib/python2.7/threading.py", line 736, in start
    _start_new_thread(self.__bootstrap, ())
error: can't start new thread
interrupted
./scripts/run_integration_test.sh: line 206: 11234 Segmentation fault      (core dumped) python setup.py nosetests --test-pipeline-options="$PIPELINE_OPTS" $TEST_OPTS
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 1.113 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 159

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 139

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 313

* What went wrong:
Execution failed for task ':beam-sdks-python:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 2

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 274

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 139

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 36s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/27k5vnuykuyms

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
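
This run fails the same way, except that the nosetests process itself is killed by a segmentation fault (exit value 139 is 128 + SIGSEGV) after OpenBLAS repeatedly fails in blas_thread_init. A mitigation sometimes used when BLAS thread pools compete for a tight RLIMIT_NPROC is to cap the OpenBLAS pool before numpy loads; a minimal sketch, assuming numpy on the executor is linked against OpenBLAS as the messages above suggest:

# sketch only: cap the BLAS thread pool before numpy is imported, so that
# blas_thread_init does not issue a burst of pthread_create calls
import os
os.environ.setdefault("OPENBLAS_NUM_THREADS", "1")

import numpy  # imported after the variable is set, on purpose

OPENBLAS_NUM_THREADS is read by OpenBLAS at initialization; whether the Beam test harness should set it is a separate question, and the build scripts shown above do not.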



Build failed in Jenkins: beam_PostCommit_Python_Verify #6783

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6783/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-6167] Add class ReadFromTextWithFilename (Python) (#7193)

------------------------------------------
[...truncated 109.88 KB...]
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/interactive/pipeline_analyzer_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/interactive/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/interactive/cache_manager.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/interactive/cache_manager_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/interactive/display/pipeline_graph.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/interactive/display/interactive_pipeline_graph.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/interactive/display/display_manager.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/interactive/display/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/interactive/display/pipeline_graph_renderer.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/internal/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/internal/names.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/sdf_direct_runner_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/watermark_manager.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/clock.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/helper_transforms.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/direct_runner_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/transform_evaluator.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/sdf_direct_runner.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/test_direct_runner.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/util.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/direct_metrics.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/bundle_factory.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/direct_runner.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/executor.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/direct_metrics_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/evaluation_context.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/direct_userstate.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/test/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/job/manager.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/job/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/job/utils.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/portable_stager.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/portable_runner_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/fn_api_runner_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/fn_api_runner.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/local_job_service.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/job_server.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/flink_runner_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/stager.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/stager_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/portable_stager_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/local_job_service_main.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/portability/portable_runner.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/test_dataflow_runner.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/dataflow_runner.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/dataflow_metrics.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/dataflow_runner_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/dataflow_metrics_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/ptransform_overrides.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/template_runner_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/native_io/streaming_create.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/native_io/iobase_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/native_io/iobase.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/native_io/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/internal/apiclient_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/internal/apiclient.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/internal/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/internal/names.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/internal/clients/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_messages.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/sdk_worker_main_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/data_plane.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/operation_specs.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/operations.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/sideinputs_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/statesampler_fast.pyx'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/opcounters_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/sdk_worker.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/sdk_worker_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/worker_id_interceptor_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/log_handler_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/statesampler_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/opcounters.pxd'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/logger.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/opcounters.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/logger_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/statesampler.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/sideinputs.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/statesampler_slow.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/bundle_processor.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/log_handler.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/statesampler_fast.pxd'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/worker_id_interceptor.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/operations.pxd'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/sdk_worker_main.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/runners/worker/data_plane_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/internal/util_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/internal/pickler.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/internal/util.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/internal/module_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/internal/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/internal/pickler_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/internal/gcp/json_value_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/internal/gcp/json_value.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/internal/gcp/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/internal/gcp/auth.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/metrics/execution.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/metrics/execution.pxd'
adding 'apache-beam-2.10.0.dev0/apache_beam/metrics/cells_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/metrics/execution_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/metrics/metricbase.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/metrics/cells.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/metrics/metric.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/metrics/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/metrics/metric_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/metrics/monitoring_infos.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/tools/coders_microbenchmark.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/tools/sideinput_microbenchmark.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/tools/map_fn_microbenchmark.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/tools/microbenchmarks_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/tools/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/tools/distribution_counter_microbenchmark.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/tools/utils.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/wordcount_debugging_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/windowed_wordcount.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/wordcount_debugging.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/avro_bitcoin.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/streaming_wordcount_debugging.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/wordcount_minimal.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/streaming_wordcount.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/wordcount.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/wordcount_it_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/fastavro_it_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/wordcount_minimal_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/streaming_wordcount_it_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/wordcount_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/bigquery_side_input_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/bigquery_side_input.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/bigquery_tornadoes_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/multiple_output_pardo_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/filters_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/datastore_wordcount.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/bigquery_tornadoes.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/custom_ptransform_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/group_with_coder_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/bigquery_schema.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/datastore_wordcount_it_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/mergecontacts.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/mergecontacts_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/group_with_coder.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/custom_ptransform.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/combiners_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/coders_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/coders.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/filters.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/cookbook/multiple_output_pardo.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/snippets/snippets_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/snippets/snippets.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/snippets/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/flink/flink_streaming_impulse.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/flink/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/distribopt_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/autocomplete_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/estimate_pi.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/tfidf.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/tfidf_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/autocomplete.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/top_wikipedia_sessions_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/distribopt.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/estimate_pi_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/top_wikipedia_sessions.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/game_stats_it_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/leader_board_it_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/user_score_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/hourly_team_score_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/leader_board_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/hourly_team_score.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/game_stats.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/leader_board.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/user_score_it_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/user_score.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/game_stats_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/game/hourly_team_score_it_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/juliaset/setup.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/juliaset/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/juliaset/juliaset_main.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/juliaset/juliaset/juliaset.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/juliaset/juliaset/juliaset_test.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/examples/complete/juliaset/juliaset/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/python_urns.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/common_urns.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/beam_fn_api_pb2_grpc.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/endpoints_pb2.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/beam_artifact_api_pb2.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/beam_provision_api_pb2.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/beam_provision_api_pb2_grpc.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/beam_job_api_pb2_grpc.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/beam_runner_api_pb2_grpc.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/endpoints_pb2_grpc.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/standard_window_fns_pb2.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/__init__.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/beam_fn_api_pb2.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/standard_window_fns_pb2_grpc.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/beam_job_api_pb2.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py'
adding 'apache-beam-2.10.0.dev0/apache_beam/portability/api/beam_runner_api_pb2.py'
Creating tar archive
removing 'apache-beam-2.10.0.dev0' (and everything under it)
sdist archive name: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build/apache-beam-2.10.0.dev0.tar.gz>
:beam-sdks-python:sdist (Thread[Task worker for ':',5,main]) completed. Took 5.896 secs.
:beam-sdks-python:installGcpTest (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:installGcpTest FAILED
Caching disabled for task ':beam-sdks-python:installGcpTest': Caching has not been enabled for the task
Task ':beam-sdks-python:installGcpTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && pip install -e <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/[gcp,test]>
Successfully started process 'command 'sh''
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python>
    Error [Errno 11] Resource temporarily unavailable while executing command python setup.py egg_info
Could not install packages due to an EnvironmentError: [Errno 11] Resource temporarily unavailable

:beam-sdks-python:installGcpTest (Thread[Task worker for ':',5,main]) completed. Took 0.354 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-python:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 25s
3 actionable tasks: 3 executed

Publishing build scan...
https://gradle.com/s/zq3nr5p3otado

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
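
Here the build dies one step earlier: pip cannot spawn "python setup.py egg_info", apparently because creating the child process fails with errno 11 (EAGAIN), which pip surfaces as the EnvironmentError above. A minimal sketch of that failure mode, assuming a process-starved Linux node (nothing Beam-specific):

# sketch: what "[Errno 11] Resource temporarily unavailable" looks like at the
# system-call level when the per-user process limit is exhausted
import errno
import os

try:
    pid = os.fork()
except OSError as exc:
    # the branch a subprocess launch ends up in on an exhausted executor
    print("fork failed: errno=%d (%s)" % (exc.errno, errno.errorcode[exc.errno]))
else:
    if pid == 0:
        os._exit(0)
    os.waitpid(pid, 0)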



Build failed in Jenkins: beam_PostCommit_Python_Verify #6782

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6782/display/redirect>

------------------------------------------
[...truncated 148.54 KB...]
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../sdks/python ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/sdks/
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../model ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/
++ echo hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6782
+ PROJECT_NAME=hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6782
+ '[' -z jenkins-beam_PostCommit_Python_Verify-6782 ']'
+ COLOR_OPT=--no-ansi
+ COMPOSE_OPT='-p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6782 --no-ansi'
+ cd ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
+ docker network prune --force
runtime/cgo: pthread_create failed: Resource temporarily unavailable
SIGABRT: abort
PC=0x7f37d021c428 m=0

goroutine 0 [idle]:

goroutine 1 [running, locked to thread]:
runtime.systemstack_switch()
	/usr/local/go/src/runtime/asm_amd64.s:252 fp=0xc42005fe48 sp=0xc42005fe40
runtime.newproc(0x0, 0xee95c8)
	/usr/local/go/src/runtime/proc.go:2713 +0x8b fp=0xc42005fe90 sp=0xc42005fe48
runtime.init.3()
	/usr/local/go/src/runtime/proc.go:213 +0x35 fp=0xc42005feb0 sp=0xc42005fe90
runtime.init()
	/usr/local/go/src/runtime/write_err.go:14 +0x2ee fp=0xc42005ff48 sp=0xc42005feb0
runtime.main()
	/usr/local/go/src/runtime/proc.go:141 +0xf6 fp=0xc42005ffa0 sp=0xc42005ff48
runtime.goexit()
	/usr/local/go/src/runtime/asm_amd64.s:2086 +0x1 fp=0xc42005ffa8 sp=0xc42005ffa0

goroutine 17 [syscall, locked to thread]:
runtime.goexit()
	/usr/local/go/src/runtime/asm_amd64.s:2086 +0x1

rax    0x0
rbx    0x7f37d05ac700
rcx    0x7f37d021c428
rdx    0x6
rdi    0x4fb8
rsi    0x4fb8
rbp    0xf1d11e
rsp    0x7ffcdc592398
r8     0x7f37d05ad770
r9     0x7f37d0be6700
r10    0x8
r11    0x206
r12    0x1801050
r13    0xf3
r14    0x30
r15    0x3
rip    0x7f37d021c428
rflags 0x206
cs     0x33
fs     0x0
gs     0x0
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) completed. Took 0.074 secs.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:postCommitIT
Caching disabled for task ':beam-sdks-python:postCommitIT': Caching has not been enabled for the task
Task ':beam-sdks-python:postCommitIT' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 --process-timeout=4500 --attr=IT"
Successfully started process 'command 'sh''


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
./scripts/run_integration_test.sh: fork: retry: Resource temporarily unavailable
./scripts/run_integration_test.sh: fork: retry: No child processes
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470: UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
Process SyncManager-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 558, in _run_server
    server.serve_forever()
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 184, in serve_forever
    t.start()
  File "/usr/lib/python2.7/threading.py", line 736, in start
    _start_new_thread(self.__bootstrap, ())
error: can't start new thread
Traceback (most recent call last):
  File "setup.py", line 237, in <module>
    'test': generate_protos_first(test),
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/__init__.py",> line 143, in setup
    return distutils.core.setup(**attrs)
  File "/usr/lib/python2.7/distutils/core.py", line 151, in setup
    dist.run_commands()
  File "/usr/lib/python2.7/distutils/dist.py", line 953, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/commands.py",> line 158, in run
    TestProgram(argv=argv, config=self.__config)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/core.py",> line 121, in __init__
    **extra_args)
  File "/usr/lib/python2.7/unittest/main.py", line 95, in __init__
    self.runTests()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/core.py",> line 207, in runTests
    result = self.testRunner.run(self.test)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 365, in run
    testQueue = Queue()
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 667, in temp
    token, exp = self._create(typeid, *args, **kwds)
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 565, in _create
    conn = self._Client(self._address, authkey=self._authkey)
  File "/usr/lib/python2.7/multiprocessing/connection.py", line 175, in Client
    answer_challenge(c, authkey)
  File "/usr/lib/python2.7/multiprocessing/connection.py", line 432, in answer_challenge
    message = connection.recv_bytes(256)         # reject large message
EOFError

> Task :beam-sdks-python:postCommitIT FAILED
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 3.481 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 159

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 313

* What went wrong:
Execution failed for task ':beam-sdks-python:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 2

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 274

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 44s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/f265ybdqdoxoi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
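
The traceback in this run shows where the thread exhaustion bites inside nose's multiprocess plugin: the plugin asks a multiprocessing SyncManager for a Queue, the manager's server has to start one thread per client connection, that thread cannot be created, and the client side sees EOFError. A stripped-down sketch of that interaction (a hypothetical reproduction, not code from the build):

# sketch of the SyncManager interaction behind the traceback above
import multiprocessing

if __name__ == "__main__":
    manager = multiprocessing.Manager()  # forks a SyncManager server process
    queue = manager.Queue()              # each proxy connection needs a server-side thread
    # on a thread-starved node the server's serve_forever() hits
    # "can't start new thread" and the connecting client raises EOFError

So the EOFError at the bottom of the traceback is a symptom of the same resource exhaustion, not a separate bug in the test code.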



Build failed in Jenkins: beam_PostCommit_Python_Verify #6781

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6781/display/redirect>

------------------------------------------
[...truncated 148.55 KB...]
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../sdks/python ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/sdks/
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../model ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/
++ echo hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6781
+ PROJECT_NAME=hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6781
+ '[' -z jenkins-beam_PostCommit_Python_Verify-6781 ']'
+ COLOR_OPT=--no-ansi
+ COMPOSE_OPT='-p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6781 --no-ansi'
+ cd ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
+ docker network prune --force
runtime/cgo: pthread_create failed: Resource temporarily unavailable
SIGABRT: abort
PC=0x7f4cb89f2428 m=0

goroutine 0 [idle]:

goroutine 1 [running, locked to thread]:
runtime.systemstack_switch()
	/usr/local/go/src/runtime/asm_amd64.s:252 fp=0xc42005fe48 sp=0xc42005fe40
runtime.newproc(0x0, 0xee95c8)
	/usr/local/go/src/runtime/proc.go:2713 +0x8b fp=0xc42005fe90 sp=0xc42005fe48
runtime.init.3()
	/usr/local/go/src/runtime/proc.go:213 +0x35 fp=0xc42005feb0 sp=0xc42005fe90
runtime.init()
	/usr/local/go/src/runtime/write_err.go:14 +0x2ee fp=0xc42005ff48 sp=0xc42005feb0
runtime.main()
	/usr/local/go/src/runtime/proc.go:141 +0xf6 fp=0xc42005ffa0 sp=0xc42005ff48
runtime.goexit()
	/usr/local/go/src/runtime/asm_amd64.s:2086 +0x1 fp=0xc42005ffa8 sp=0xc42005ffa0

goroutine 17 [syscall, locked to thread]:
runtime.goexit()
	/usr/local/go/src/runtime/asm_amd64.s:2086 +0x1

rax    0x0
rbx    0x7f4cb8d82700
rcx    0x7f4cb89f2428
rdx    0x6
rdi    0x475b
rsi    0x475b
rbp    0xf1d11e
rsp    0x7fff043a4ff8
r8     0x7f4cb8d83770
r9     0x7f4cb93bc700
r10    0x8
r11    0x202
r12    0x1890050
r13    0xf3
r14    0x30
r15    0x3
rip    0x7f4cb89f2428
rflags 0x202
cs     0x33
fs     0x0
gs     0x0
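
The dump above begins with "runtime/cgo: pthread_create failed: Resource temporarily unavailable", i.e. the Go runtime inside the docker CLI could not start a new OS thread. That usually points at the Jenkins worker hitting its per-user process/thread limit rather than at Docker itself. A hedged sketch of how one might confirm that on the agent (ordinary Linux commands, not part of the job definition):

  # Check thread/process headroom on the build agent.
  ulimit -u                          # max user processes (threads count against this)
  ps -eLf | wc -l                    # threads currently running on the host
  cat /proc/sys/kernel/threads-max   # system-wide thread ceiling
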
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) completed. Took 0.073 secs.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:postCommitIT
Caching disabled for task ':beam-sdks-python:postCommitIT': Caching has not been enabled for the task
Task ':beam-sdks-python:postCommitIT' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 --process-timeout=4500 --attr=IT"
Successfully started process 'command 'sh''


###########################################################################
# Build pipeline options if they were not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not already exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
./scripts/run_integration_test.sh: fork: retry: Resource temporarily unavailable
./scripts/run_integration_test.sh: fork: retry: No child processes
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20
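
The single options string echoed above is produced by the IFS=" " ; echo "${opts[*]}" line in the script: expanding ${opts[*]} with IFS set to one space joins the array elements into a space-separated string inside the command substitution. A self-contained sketch of that idiom (the values below are illustrative, not the job's real settings):

  # Minimal sketch of the array-join idiom used by run_integration_test.sh.
  opts=(
    "--runner=TestDataflowRunner"
    "--num_workers=1"
  )
  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")
  echo "$PIPELINE_OPTS"   # prints: --runner=TestDataflowRunner --num_workers=1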

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470: UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
reading manifest template 'MANIFEST.in'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
Process SyncManager-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 558, in _run_server
    server.serve_forever()
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 184, in serve_forever
    t.start()
  File "/usr/lib/python2.7/threading.py", line 736, in start
    _start_new_thread(self.__bootstrap, ())
error: can't start new thread
Traceback (most recent call last):
  File "setup.py", line 237, in <module>
    'test': generate_protos_first(test),
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/__init__.py",> line 143, in setup
    return distutils.core.setup(**attrs)
  File "/usr/lib/python2.7/distutils/core.py", line 151, in setup
    dist.run_commands()
  File "/usr/lib/python2.7/distutils/dist.py", line 953, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/commands.py",> line 158, in run
    TestProgram(argv=argv, config=self.__config)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/core.py",> line 121, in __init__
    **extra_args)
  File "/usr/lib/python2.7/unittest/main.py", line 95, in __init__
    self.runTests()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/core.py",> line 207, in runTests
    result = self.testRunner.run(self.test)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 365, in run
    testQueue = Queue()
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 667, in temp
    token, exp = self._create(typeid, *args, **kwds)
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 565, in _create
    conn = self._Client(self._address, authkey=self._authkey)
  File "/usr/lib/python2.7/multiprocessing/connection.py", line 175, in Client
    answer_challenge(c, authkey)
  File "/usr/lib/python2.7/multiprocessing/connection.py", line 432, in answer_challenge
    message = connection.recv_bytes(256)         # reject large message
EOFError
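
The EOFError above is the client-side symptom of the same resource exhaustion seen earlier: nose's multiprocess plugin (enabled by --processes=8 in the test options) asks a multiprocessing SyncManager for a shared test queue, the manager's server process cannot start a handler thread ("error: can't start new thread" above), and the connection is closed before the authentication handshake in answer_challenge completes. As a hedged workaround sketch only, the same invocation can be run with the multiprocess plugin disabled, which avoids the SyncManager entirely at the cost of serial test execution:

  # Hedged sketch, not part of the job definition: same nosetests call with
  # nose's multiprocess plugin disabled (--processes=0 is nose's default).
  python setup.py nosetests \
    --test-pipeline-options="$PIPELINE_OPTS" \
    --nocapture --processes=0 --process-timeout=4500 --attr=IT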

> Task :beam-sdks-python:postCommitIT FAILED
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 3.516 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 159

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 313

* What went wrong:
Execution failed for task ':beam-sdks-python:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 2

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 274

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 40s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/wwghxdb3ifxpc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org