Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/07/18 19:04:46 UTC
Build failed in Jenkins: beam_PostCommit_Python2 #2683
See <https://ci-beam.apache.org/job/beam_PostCommit_Python2/2683/display/redirect>
Changes:
------------------------------------------
[...truncated 12.74 MB...]
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$<string of 176 bytes>",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$<string of 176 bytes>",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    },
                    {
                      "@type": "FastPrimitivesCoder$<string of 176 bytes>",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "ReadFromSpanner/Reshuffle/RemoveRandomKeys.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s6"
        },
        "serialized_fn": "<string of 1232 bytes>",
        "user_name": "ReadFromSpanner/Reshuffle/RemoveRandomKeys"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s8",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "_ReadFromPartitionFn",
            "type": "STRING",
            "value": "apache_beam.io.gcp.experimental.spannerio._ReadFromPartitionFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:stream",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$<string of 176 bytes>",
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$<string of 176 bytes>",
                          "component_encodings": [],
                          "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                        },
                        {
                          "@type": "FastPrimitivesCoder$<string of 176 bytes>",
                          "component_encodings": [],
                          "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                        }
                      ],
                      "is_pair_like": true,
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ],
                  "is_stream_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "ReadFromSpanner/Read From Partitions.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s7"
        },
        "serialized_fn": "<string of 912 bytes>",
        "user_name": "ReadFromSpanner/Read From Partitions"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
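The step definitions above are plain JSON, so when triaging a failed submission it can be easier to pull out just the step names, kinds, and user names than to scan the full dump. A minimal stdlib sketch; the `job_json` fragment below is a hypothetical, cut-down graph in the same shape as the log above, not the actual 12.74 MB dump:

```python
import json

# Hypothetical cut-down job graph, shaped like the dump above.
job_json = """
{
  "steps": [
    {"kind": "ParallelDo", "name": "s7",
     "properties": {"user_name": "ReadFromSpanner/Reshuffle/RemoveRandomKeys"}},
    {"kind": "ParallelDo", "name": "s8",
     "properties": {"user_name": "ReadFromSpanner/Read From Partitions"}}
  ],
  "type": "JOB_TYPE_BATCH"
}
"""

def summarize_steps(job):
    """Return (name, kind, user_name) for each step in a job graph dict."""
    return [(s["name"], s["kind"], s["properties"].get("user_name", ""))
            for s in job.get("steps", [])]

job = json.loads(job_json)
for name, kind, user_name in summarize_steps(job):
    print(name, kind, user_name)
```

`summarize_steps` is an illustrative helper, not part of the Beam or Dataflow APIs.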
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
createTime: u'2020-07-18T18:16:56.766380Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2020-07-18_11_16_55-9755840982851975821'
location: u'us-central1'
name: u'beamapp-jenkins-0718181647-694768'
projectId: u'apache-beam-testing'
stageStates: []
startTime: u'2020-07-18T18:16:56.766380Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-07-18_11_16_55-9755840982851975821]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 2020-07-18_11_16_55-9755840982851975821
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_16_55-9755840982851975821?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-07-18_11_16_55-9755840982851975821 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:16:55.637Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-07-18_11_16_55-9755840982851975821.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:16:55.637Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-07-18_11_16_55-9755840982851975821. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:16:59.590Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.300Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.333Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step ReadFromSpanner/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.374Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.409Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.485Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.530Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.557Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromSpanner/Generate Partitions into ReadFromSpanner/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.590Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromSpanner/Reshuffle/AddRandomKeys into ReadFromSpanner/Generate Partitions
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.623Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromSpanner/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into ReadFromSpanner/Reshuffle/AddRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.656Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromSpanner/Reshuffle/ReshufflePerKey/GroupByKey/Reify into ReadFromSpanner/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.692Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromSpanner/Reshuffle/ReshufflePerKey/GroupByKey/Write into ReadFromSpanner/Reshuffle/ReshufflePerKey/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.729Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromSpanner/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into ReadFromSpanner/Reshuffle/ReshufflePerKey/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.763Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromSpanner/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into ReadFromSpanner/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.791Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromSpanner/Reshuffle/RemoveRandomKeys into ReadFromSpanner/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.823Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromSpanner/Read From Partitions into ReadFromSpanner/Reshuffle/RemoveRandomKeys
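The fused stage names in the messages above come from Beam's Reshuffle transform, which breaks fusion by pairing each element with a random key, grouping by that key, and then dropping the keys again. A stdlib-only model of that key dance (illustrative only; the real transform also reifies and restores timestamps, and the GroupByKey is a distributed shuffle on the runner):

```python
import random
from collections import defaultdict

def reshuffle(elements, num_buckets=4, seed=0):
    """Illustrative model of Reshuffle: AddRandomKeys -> GroupByKey ->
    RemoveRandomKeys. Redistributes elements across buckets so downstream
    work is not fused to the upstream producer; the elements themselves
    come back unchanged."""
    rng = random.Random(seed)
    # AddRandomKeys: pair each element with a random shard key.
    keyed = [(rng.randrange(num_buckets), e) for e in elements]
    # GroupByKey: collect elements per key (a shuffle on a real runner).
    groups = defaultdict(list)
    for key, element in keyed:
        groups[key].append(element)
    # RemoveRandomKeys: drop the keys, yielding the same elements back.
    return [e for key in sorted(groups) for e in groups[key]]
```

`reshuffle`, `num_buckets`, and `seed` are names chosen for this sketch, not Beam API.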
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.858Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.915Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:00.944Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:01.026Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Workflow failed due to internal error. Please contact Google Cloud Support.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:01.085Z: JOB_MESSAGE_DEBUG: Executing wait step start13
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:01.350Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:01.411Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:01.448Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-18T18:17:02.308Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-07-18_11_16_55-9755840982851975821 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py27.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 64 tests in 3706.606s
FAILED (SKIP=7, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_03_28-14990433926899190025?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_16_55-9755840982851975821?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_17_23-16091624702842352466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_24_03-866739995997255729?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_31_01-15815430234782788260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_38_08-17800423651605555415?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_44_40-12176770960353054222?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_50_59-17471614966231089786?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_57_44-9413439892243594538?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_03_22-5707938527256641587?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_24_40-12241230655919009187?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_31_52-12600490822818082285?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_39_35-15104344197651300548?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_46_54-7546264695392245396?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_03_25-17663000927417386880?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_14_46-938091919218596068?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_21_10-6851699527891618349?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_28_15-17796500655768625495?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_35_48-10636575429485622630?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_42_09-6956113973202453276?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_49_33-8716156893702751472?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_03_22-12979621417686968364?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_19_52-10093312920715553392?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_26_56-18081060418450798570?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_33_52-108541839819681694?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_41_39-9526144672691716035?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_48_13-4696593353391884344?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_03_23-7932484712101086078?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_11_22-16591302465667606096?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_18_49-14179807284888003417?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_26_19-2451777519701569532?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_33_18-12773515854472967739?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_40_37-2634812474146478039?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_47_35-8749365084989080060?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_54_43-3611156046042170850?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_03_22-9124941702560295199?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_10_08-13196200323513451869?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_17_55-5559346636961832345?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_25_09-7678051354060137083?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_31_45-13631791356188159146?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_38_23-14723146721317870540?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_45_27-17418332216900222544?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_52_08-9830723635842288218?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_03_23-6967198521685801271?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_11_14-10452238449000862085?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_19_19-6460453673894769829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_28_09-6343985438063251315?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_35_31-2544572792869636197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_42_40-17714226706143669046?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_49_16-259188582624256743?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_03_24-5785205752124849222?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_12_34-3899284989421301644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_22_21-14142648315709717112?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_29_09-14626644600884812663?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_35_32-10926671329456693784?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-18_11_52_29-5959410418011845037?project=apache-beam-testing
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 116
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 4m 14s
151 actionable tasks: 115 executed, 34 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/m3nxhbzzydmw6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_Python2 #2684
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python2/2684/display/redirect?page=changes>