Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/12/05 11:54:13 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #6746

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6746/display/redirect?page=changes>

Changes:

[robertwb] Revert "Revert "Optimize several Python coder implementations.""

[robertwb] [BEAM-6153] Stricter interval window comparison.

------------------------------------------
[...truncated 148.47 KB...]
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 3 tests in 24.612s

OK
:beam-sdks-python:directRunnerIT (Thread[Task worker for ':',5,main]) completed. Took 51.853 secs.
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:hdfsIntegrationTest
Caching disabled for task ':beam-sdks-python:hdfsIntegrationTest': Caching has not been enabled for the task
Task ':beam-sdks-python:hdfsIntegrationTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./apache_beam/io/hdfs_integration_test/hdfs_integration_test.sh
Successfully started process 'command 'sh''
++ dirname ./apache_beam/io/hdfs_integration_test/hdfs_integration_test.sh
+ TEST_DIR=./apache_beam/io/hdfs_integration_test
+ ROOT_DIR=./apache_beam/io/hdfs_integration_test/../../../../..
+ CONTEXT_DIR=./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
+ rm -r ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
rm: cannot remove './apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration': No such file or directory
+ true
+ mkdir -p ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/sdks
+ cp ./apache_beam/io/hdfs_integration_test/docker-compose.yml ./apache_beam/io/hdfs_integration_test/Dockerfile ./apache_beam/io/hdfs_integration_test/hdfscli.cfg ./apache_beam/io/hdfs_integration_test/hdfs_integration_test.sh ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../sdks/python ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/sdks/
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../model ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/
++ echo hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6746
+ PROJECT_NAME=hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6746
+ '[' -z jenkins-beam_PostCommit_Python_Verify-6746 ']'
+ COLOR_OPT=--no-ansi
+ COMPOSE_OPT='-p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6746 --no-ansi'
+ cd ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
+ docker network prune --force
+ trap finally EXIT
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6746 --no-ansi build
namenode uses an image, skipping
datanode uses an image, skipping
Building test
Step 1/9 : FROM python:2
 ---> f67e752245d6
Step 2/9 : WORKDIR /app
 ---> Using cache
 ---> 1c741002a3ed
Step 3/9 : ENV HDFSCLI_CONFIG /app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg
 ---> Using cache
 ---> 7962da95064b
Step 4/9 : RUN pip install --no-cache-dir holdup gsutil
 ---> Using cache
 ---> 7871e416a64c
Step 5/9 : RUN gsutil cp gs://dataflow-samples/shakespeare/kinglear.txt .
 ---> Using cache
 ---> b78f9af4ffec
Step 6/9 : ADD sdks/python /app/sdks/python
 ---> ea14e5d836cf
Removing intermediate container 6e72afc3f6b6
Step 7/9 : ADD model /app/model
 ---> 2004b9dc8a96
Removing intermediate container b168dbdfe862
Step 8/9 : RUN cd sdks/python &&     python setup.py sdist &&     pip install --no-cache-dir $(ls dist/apache-beam-*.tar.gz | tail -n1)[gcp]
 ---> Running in c2a51f08513f
Service 'test' failed to build: grpc: the connection is unavailable
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6746 --no-ansi down
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-6746_test_net
Network hdfs_it-jenkins-beam_postcommit_python_verify-6746_test_net not found.

real	0m0.334s
user	0m0.268s
sys	0m0.061s

> Task :beam-sdks-python:hdfsIntegrationTest FAILED
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) completed. Took 3.102 secs.
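[Note: the "grpc: the connection is unavailable" message above is emitted while docker-compose is building the 'test' image, and appears to come from the Docker daemon on the Jenkins worker rather than from the Beam sources. A rough local repro sketch, assuming a Beam checkout with Docker and docker-compose available, using the same entry point the Gradle task invokes above:

    cd sdks/python
    # Copies the test context into build/hdfs_integration and runs
    # 'docker-compose ... build' (and 'down' on exit), as traced above.
    ./apache_beam/io/hdfs_integration_test/hdfs_integration_test.sh
]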
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:postCommitIT
Caching disabled for task ':beam-sdks-python:postCommitIT': Caching has not been enabled for the task
Task ':beam-sdks-python:postCommitIT' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 --process-timeout=4500 --attr=IT"
Successfully started process 'command 'sh''


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"


>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20
###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
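[Note: per the script's own comment, the options block above only runs when $PIPELINE_OPTS is empty, so callers can pass their own options via --pipeline_opts. A hedged usage sketch; the option values below are illustrative, not taken from this build:

    # Supply pipeline options directly instead of letting the script assemble them.
    ./scripts/run_integration_test.sh \
        --test_opts "--nocapture --attr=IT" \
        --pipeline_opts "--runner=DirectRunner --sdk_location=dist/apache-beam-2.10.0.dev0.tar.gz"
]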
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470: UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 18 tests in 2791.587s

OK
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_07_50-3042368228449364011?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_14_40-9023293289021848339?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_21_58-14904918531373409048?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_28_13-6927377882628187980?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_34_48-1688246799609258910?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_41_44-10768632471883401012?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_47_55-3141901977035436610?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_07_54-14722817272086250974?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_07_51-13162047459022426161?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_07_54-2398958797697606945?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_20_34-6489798816311569896?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_26_21-16901728638329805053?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_32_23-1178423467761972895?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_07_51-12488358383907802308?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_07_51-4421239659566823623?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_15_07-11056016722511189414?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_22_04-9825133076994189264?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_28_19-18206741095925971567?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_07_51-15525486192554631807?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_14_57-10272901095950453552?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_21_49-12080722191830311440?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_07_51-17257324483868289901?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_03_16_17-10152087011308562207?project=apache-beam-testing.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 46 mins 32.42 secs.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 313

* What went wrong:
Execution failed for task ':beam-sdks-python:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 48m 2s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/mhisccbgabque

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
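[Note: following the "* Try:" hint above, the failing task can be re-run in isolation for a full stack trace; a sketch, assuming a local Beam checkout with the Gradle wrapper:

    # Re-run only the failing task with extra diagnostics (--stacktrace as suggested by Gradle above).
    ./gradlew :beam-sdks-python:hdfsIntegrationTest --stacktrace --info
]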



Jenkins build is back to normal : beam_PostCommit_Python_Verify #6748

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6748/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python_Verify #6747

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6747/display/redirect>

------------------------------------------
[...truncated 148.09 KB...]
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 3 tests in 23.959s

OK
:beam-sdks-python:directRunnerIT (Thread[Task worker for ':',5,main]) completed. Took 50.665 secs.
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:hdfsIntegrationTest
Caching disabled for task ':beam-sdks-python:hdfsIntegrationTest': Caching has not been enabled for the task
Task ':beam-sdks-python:hdfsIntegrationTest' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./apache_beam/io/hdfs_integration_test/hdfs_integration_test.sh
Successfully started process 'command 'sh''
++ dirname ./apache_beam/io/hdfs_integration_test/hdfs_integration_test.sh
+ TEST_DIR=./apache_beam/io/hdfs_integration_test
+ ROOT_DIR=./apache_beam/io/hdfs_integration_test/../../../../..
+ CONTEXT_DIR=./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
+ rm -r ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
rm: cannot remove './apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration': No such file or directory
+ true
+ mkdir -p ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/sdks
+ cp ./apache_beam/io/hdfs_integration_test/docker-compose.yml ./apache_beam/io/hdfs_integration_test/Dockerfile ./apache_beam/io/hdfs_integration_test/hdfscli.cfg ./apache_beam/io/hdfs_integration_test/hdfs_integration_test.sh ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../sdks/python ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/sdks/
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../model ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/
++ echo hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6747
+ PROJECT_NAME=hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6747
+ '[' -z jenkins-beam_PostCommit_Python_Verify-6747 ']'
+ COLOR_OPT=--no-ansi
+ COMPOSE_OPT='-p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6747 --no-ansi'
+ cd ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
+ docker network prune --force
+ trap finally EXIT
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6747 --no-ansi build
namenode uses an image, skipping
datanode uses an image, skipping
Building test
Step 1/9 : FROM python:2
 ---> f67e752245d6
Step 2/9 : WORKDIR /app
 ---> Using cache
 ---> 1c741002a3ed
Step 3/9 : ENV HDFSCLI_CONFIG /app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg
 ---> Using cache
 ---> 7962da95064b
Step 4/9 : RUN pip install --no-cache-dir holdup gsutil
 ---> Using cache
 ---> 7871e416a64c
Step 5/9 : RUN gsutil cp gs://dataflow-samples/shakespeare/kinglear.txt .
 ---> Using cache
 ---> b78f9af4ffec
Step 6/9 : ADD sdks/python /app/sdks/python
 ---> c0684ba4352d
Removing intermediate container 4c64626831ac
Step 7/9 : ADD model /app/model
 ---> 7caf0d736e0b
Removing intermediate container 1580b3c2414c
Step 8/9 : RUN cd sdks/python &&     python setup.py sdist &&     pip install --no-cache-dir $(ls dist/apache-beam-*.tar.gz | tail -n1)[gcp]
 ---> Running in 4d62f148ac97
Service 'test' failed to build: grpc: the connection is unavailable
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6747 --no-ansi down
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-6747_test_net
Network hdfs_it-jenkins-beam_postcommit_python_verify-6747_test_net not found.

real	0m0.291s
user	0m0.204s
sys	0m0.082s

> Task :beam-sdks-python:hdfsIntegrationTest FAILED
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) completed. Took 2.723 secs.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:postCommitIT
Caching disabled for task ':beam-sdks-python:postCommitIT': Caching has not been enabled for the task
Task ':beam-sdks-python:postCommitIT' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 --process-timeout=4500 --attr=IT"
Successfully started process 'command 'sh''


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470: UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 18 tests in 2830.659s

OK
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_01_47-4409041605397656479?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_15_55-11922243169380857359?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_22_31-11063823310974501498?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_28_42-8088877519026382831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_35_12-13585539060263595324?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_41_44-1806031361338081684?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_01_46-8257815816746662919?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_01_47-16182175540169385043?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_14_22-13747152303631351363?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_20_48-8887028493394544788?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_26_30-2150134304890381190?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_01_46-8591068230946819372?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_01_46-6371225585374779994?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_08_52-8728370730378642819?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_16_23-5344004936186672176?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_22_54-294956117798134872?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_01_45-1522057012608226469?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_07_48-6479274496564992192?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_15_48-5991428151396052903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_01_45-6490813255935993277?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_08_47-14022005629068223938?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_01_46-5628726503554062298?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-05_04_09_37-8704687855426382113?project=apache-beam-testing.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 47 mins 11.424 secs.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 313

* What went wrong:
Execution failed for task ':beam-sdks-python:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 48m 37s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/hmuo5yirafl2w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org