Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/02/19 00:54:30 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #7444

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7444/display/redirect>

------------------------------------------
[...truncated 419.46 KB...]
datanode_1  | 19/02/19 00:04:26 INFO datanode.DataNode: PacketResponder: BP-968606987-172.18.0.2-1550534615017:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/02/19 00:04:26 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/02/19 00:04:26 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/02/19 00:04:26 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-968899653_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f9577dcf320> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f9577dcf410> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7f9577dcf488> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7f9577dcf500> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7f9577dcf578> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7f9577dcf668> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7f9577dcf6e0> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7f9577dcf758> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7f9577dcf7d0> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7f9577dcf938> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7f9577dcf9b0> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7f9577dcfa28> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/02/19 00:04:29 INFO datanode.webhdfs: 172.18.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/02/19 00:04:31 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-ecf7dbb033d911e988990242ac120004/03f19dd6-757b-45be-9320-529fcc4fec68.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/02/19 00:04:32 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.3:50010 for /beam-temp-py-wordcount-integration-ecf7dbb033d911e988990242ac120004/03f19dd6-757b-45be-9320-529fcc4fec68.py-wordcount-integration
datanode_1  | 19/02/19 00:04:32 INFO datanode.DataNode: Receiving BP-968606987-172.18.0.2-1550534615017:blk_1073741826_1002 src: /172.18.0.3:37650 dest: /172.18.0.3:50010
datanode_1  | 19/02/19 00:04:32 INFO DataNode.clienttrace: src: /172.18.0.3:37650, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-976117344_69, offset: 0, srvID: fbb00e2d-54ee-4db8-8a56-5d70d1c1c461, blockid: BP-968606987-172.18.0.2-1550534615017:blk_1073741826_1002, duration: 10761207
datanode_1  | 19/02/19 00:04:32 INFO datanode.DataNode: PacketResponder: BP-968606987-172.18.0.2-1550534615017:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/02/19 00:04:32 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-ecf7dbb033d911e988990242ac120004/03f19dd6-757b-45be-9320-529fcc4fec68.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-976117344_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.13 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7444_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7444_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7444_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7444_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7444_namenode_1 ... done
Aborting on container exit...

real	1m50.994s
user	0m1.289s
sys	0m0.238s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7444 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7444_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7444_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7444_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7444_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7444_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7444_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7444_test_net

real	0m0.875s
user	0m0.332s
sys	0m0.075s

> Task :beam-sdks-python:postCommitIT


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
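Because the suite is driven through setup.py nosetests, the same entry point can be pointed at a single test module when reproducing a failure outside Jenkins. A hedged sketch, assuming nose's --tests option; the module name is taken from the test list below, but the DirectRunner pipeline options are illustrative, not the ones this build used:

  # run just one integration test module locally (illustrative options)
  python setup.py nosetests \
    --tests=apache_beam.examples.wordcount_it_test \
    --test-pipeline-options="--runner=DirectRunner"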
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2970.865s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_05_12-1113271916929196307?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_20_10-12873942318688954826?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_26_44-1908499138015540407?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_33_48-6730111105918484014?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_41_07-16092933343442483237?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_47_36-13778888308608147701?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_05_11-168855057830853416?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_05_11-17353987032179813568?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_17_45-745995133498060446?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_26_56-14144200417133270174?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_36_12-9882251500488110635?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_05_10-10401387883540877518?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_05_10-9942352241824970799?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_12_31-13083625074615314507?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_20_02-6994821134932027106?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_05_10-4772240625222016707?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_12_43-2058994000413779729?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_20_00-18210778676022541189?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_27_54-3353134687203959206?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_05_10-8798395336956363923?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_13_11-4339549763180038861?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_21_24-3050838856617931413?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_28_21-15089091800361840504?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_05_10-12475103699475021807?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_13_25-18391290351571358709?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_22_59-17011070286337654918?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 173

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
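Since the failing task and build-script line are named above, the usual next step is to rerun only that task with the flags Gradle suggests, for example:

  # rerun just the failed task with a stack trace and more log output
  ./gradlew :beam-sdks-python:directRunnerIT --stacktrace --info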

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 18s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/vjfzssbm6gay4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python_Verify #7446

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7446/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python_Verify #7445

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7445/display/redirect>

------------------------------------------
[...truncated 262.50 KB...]
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_04_55-3028333081933049683?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_17_14-7243828520229846750?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_26_00-5148218093039254492?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_37_36-14622499825878890687?project=apache-beam-testing.
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ERROR
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.

======================================================================
ERROR: test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/case.py",> line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/case.py",> line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/case.py",> line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_it_test.py",> line 158, in test_big_query_standard_sql
    big_query_query_to_table_pipeline.run_bq_pipeline(options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py",> line 75, in run_bq_pipeline
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 107, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 71, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1232, in cancel
    self.job_id(), 'JOB_STATE_CANCELLED'):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 184, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 622, in modify_job_state
    self._client.projects_locations_jobs.Update(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 683, in Update
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1694, in request
    (response, content) = self._request(conn, authority, uri, request_uri, method, body, headers, redirections, cachekey)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1434, in _request
    (response, content) = self._conn_request(conn, request_uri, method, body, headers)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1390, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1136, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 453, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 409, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)'
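The TimedOutException above is raised by nose's multiprocess plugin when a test exceeds its per-worker timeout; the stack shows the Dataflow job-cancellation HTTP call still waiting on a response when the timeout signal fired. If the limit itself is suspected, it can be raised when reproducing locally via the plugin's options (the values below are illustrative, not the ones Jenkins uses):

  # nose multiprocess plugin: 4 worker processes, 30-minute per-test timeout (illustrative)
  python setup.py nosetests \
    --processes=4 \
    --process-timeout=1800 \
    --test-pipeline-options="$PIPELINE_OPTS"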

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 5241.658s

FAILED (SKIP=1, errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_04_56-8817760001247063961?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_19_36-2098326032617957306?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_28_16-18001957671131999443?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_04_55-18373880680592484396?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_04_54-2257599614504121072?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_04_54-10447927953172926320?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_12_15-11276334143669690275?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_20_13-12208108770575204851?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_27_02-14791551138861445278?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_35_56-7829288533119093149?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_43_06-10014255843156380000?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_50_00-10821112910246133783?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_04_54-8442275851638419912?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_12_35-2640307284154418709?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_20_58-2716235115824645213?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_27_32-18446097699069433021?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_04_54-3740940777954554501?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_12_45-1942847434685186301?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_19_46-17464513125222797997?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_04_54-8877419763897332516?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_12_53-3000560901162925125?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_22_22-12135068209187524196?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 275

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 31m 50s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/4yrzo63ikej46

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
