Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/04/09 07:38:56 UTC

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #3104

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/3104/display/redirect>

------------------------------------------
[...truncated 212.59 KB...]
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s13"
        }, 
        "serialized_fn": "<string of 1340 bytes>", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-04-09T06:15:16.221768Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-08_23_15_15-13893963118752672512'
 location: u'us-central1'
 name: u'beamapp-jenkins-0409061507-539431'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-09T06:15:16.221768Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-08_23_15_15-13893963118752672512]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_15_15-13893963118752672512?project=apache-beam-testing
root: INFO: Job 2019-04-08_23_15_15-13893963118752672512 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-09T06:15:15.215Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-08_23_15_15-13893963118752672512. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-09T06:15:15.290Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-08_23_15_15-13893963118752672512.
root: INFO: 2019-04-09T06:15:18.120Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-09T06:15:18.861Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-04-09T06:15:19.452Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-09T06:15:19.483Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-09T06:15:19.538Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-09T06:15:19.583Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-09T06:15:19.656Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-09T06:15:19.718Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-09T06:15:19.748Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-04-09T06:15:19.800Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-04-09T06:15:19.840Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-09T06:15:19.888Z: JOB_MESSAGE_DETAILED: Unzipping flatten s10 for input s8.out
root: INFO: 2019-04-09T06:15:19.944Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-04-09T06:15:20.008Z: JOB_MESSAGE_DETAILED: Unzipping flatten s10-u13 for input s11-reify-value0-c11
root: INFO: 2019-04-09T06:15:20.056Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-09T06:15:20.108Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-04-09T06:15:20.166Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2019-04-09T06:15:20.207Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-09T06:15:20.253Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-04-09T06:15:20.311Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-04-09T06:15:20.373Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-04-09T06:15:20.413Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into FlatMap(<lambda at sideinputs_test.py:169>)/FlatMap(<lambda at sideinputs_test.py:169>)
root: INFO: 2019-04-09T06:15:20.467Z: JOB_MESSAGE_DETAILED: Fusing consumer FlatMap(<lambda at sideinputs_test.py:169>)/FlatMap(<lambda at sideinputs_test.py:169>) into start/Read
root: INFO: 2019-04-09T06:15:20.523Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-09T06:15:20.594Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-09T06:15:20.647Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-09T06:15:20.694Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-09T06:15:20.894Z: JOB_MESSAGE_DEBUG: Executing wait step start21
root: INFO: 2019-04-09T06:15:20.986Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-04-09T06:15:21.029Z: JOB_MESSAGE_BASIC: Executing operation side/Read
root: INFO: 2019-04-09T06:15:21.040Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-09T06:15:21.089Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2019-04-09T06:15:21.146Z: JOB_MESSAGE_DEBUG: Value "side/Read.out" materialized.
root: INFO: 2019-04-09T06:15:21.210Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-04-09T06:15:21.262Z: JOB_MESSAGE_BASIC: Executing operation FlatMap(<lambda at sideinputs_test.py:169>)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-04-09T06:15:21.305Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-09T06:15:21.368Z: JOB_MESSAGE_DEBUG: Value "FlatMap(<lambda at sideinputs_test.py:169>)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-04-09T06:15:21.465Z: JOB_MESSAGE_BASIC: Executing operation start/Read+FlatMap(<lambda at sideinputs_test.py:169>)/FlatMap(<lambda at sideinputs_test.py:169>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-09T06:15:34.607Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-09T06:16:39.822Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-09T06:16:39.854Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-09T06:17:58.437Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-09T06:17:58.499Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
root: INFO: 2019-04-09T07:15:21.417Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: The Dataflow job appears to be stuck because no worker activity has been seen in the last 1h. You can get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
root: INFO: 2019-04-09T07:15:21.580Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2019-04-08_23_15_15-13893963118752672512.
root: INFO: 2019-04-09T07:15:21.688Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-09T07:15:21.749Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-09T07:15:21.780Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-09T07:19:07.035Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-09T07:19:07.072Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-09T07:19:07.099Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-08_23_15_15-13893963118752672512 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 4414.604s

FAILED (errors=1)
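For reference, the fused step names in the stuck job above (start/Read, side/Read, FlatMap(<lambda at sideinputs_test.py:169>), assert_that/...) correspond to a side-input pipeline of roughly the following shape. This is a minimal sketch with illustrative labels and element values, not the actual body of sideinputs_test.py:

    # Minimal sketch of the side-input pattern behind the step names above;
    # labels and element values are illustrative, not copied from the test.
    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    with TestPipeline() as p:
      main = p | 'start' >> beam.Create(['a', 'b'])
      side = p | 'side' >> beam.Create([1, 2, 3])
      result = main | beam.FlatMap(
          lambda x, xs: [(x, s) for s in xs], beam.pvalue.AsIter(side))
      # assert_that expands into the ToVoidKey/Group/Unkey/Match steps
      # that appear in the Dataflow job graph logged above.
      assert_that(result, equal_to(
          [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2), ('b', 3)]))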
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_05_55-5828118299630579038?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_15_15-13893963118752672512?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_05_55-5645393628079215282?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_16_40-6266555908957426859?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_05_56-1771203639456982828?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_15_25-14920052838666104051?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_05_55-428073556742840713?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_19_15-10340131324601527809?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_05_55-4633328153895644286?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_14_54-2402013314429681359?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_05_54-12458540120754445380?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_13_09-15426777932314078490?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_05_55-14847121870152942788?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_13_45-17808792995129078043?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_05_55-8839280035774763921?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-08_23_15_24-12041138594362284127?project=apache-beam-testing.

> Task :beam-sdks-python:validatesRunnerBatchTests FAILED

> Task :beam-sdks-python:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner,!sickbay-streaming
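The --attr=ValidatesRunner,!sickbay-streaming selector above picks up tests marked with nose's ValidatesRunner attribute. A minimal sketch of the shape of such a test follows; the class and method names are hypothetical, not taken from the Beam suites:

    # Sketch of a ValidatesRunner-style test; names are illustrative only.
    import unittest

    from nose.plugins.attrib import attr

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to


    class ExampleValidatesRunnerTest(unittest.TestCase):

      @attr('ValidatesRunner')
      def test_simple_map(self):
        # TestPipeline reads its runner and project settings from the
        # --test-pipeline-options command-line flag when run under nose.
        with TestPipeline() as p:
          result = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)
          assert_that(result, equal_to([2, 4, 6]))


    if __name__ == '__main__':
      unittest.main()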
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 14 tests in 1084.119s

OK
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_19_35-13259254779501109069?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_28_02-7468451505580487988?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_19_34-18281721391958353731?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_27_37-14172812279989351585?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_19_34-6647971617436943209?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_19_35-5697642979423879371?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_27_59-10121242802581605514?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_19_34-14402336729821996610?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_26_22-12075241212040180565?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_19_34-719014745288366163?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_27_57-9777631612968271819?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_19_34-10884029940023646652?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_28_02-4131118680899798970?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_00_19_34-4184854659317421610?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build.gradle>' line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 27s
61 actionable tasks: 44 executed, 17 from cache

Publishing build scan...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (2 retries remaining)...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (1 retry remaining)...

A network error occurred.

If you require assistance with this problem, please report it via https://gradle.com/scans/help/plugin and include the following information via copy/paste.

----------
Gradle version: 5.2.1
Plugin version: 2.1
Request URL: https://scans-in.gradle.com/in/5.2.1/2.1
Request ID: 29e1ed8e-ce6d-44a2-b182-1cccaec39497
Exception: java.net.SocketTimeoutException: Read timed out
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #3106

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/3106/display/redirect>



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #3105

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/3105/display/redirect?page=changes>

Changes:

[amaliujia] [BEAM-7010] MAX/MIN(STRING)

------------------------------------------
[...truncated 307.62 KB...]
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s16"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_21", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s18", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s17"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_22", 
        "user_name": "assert_that/Match"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s19", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "kind:varint"
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s4"
        }, 
        "serialized_fn": "ref_AppliedPTransform_compute/MapToVoidKey0_23", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: u'2019-04-09T09:00:56.995816Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-09_02_00_56-13819156804711885751'
 location: u'us-central1'
 name: u'beamapp-jenkins-0409090046-548549'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-09T09:00:56.995816Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-09_02_00_56-13819156804711885751]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_00_56-13819156804711885751?project=apache-beam-testing
root: WARNING: Waiting indefinitely for streaming job.
root: INFO: Job 2019-04-09_02_00_56-13819156804711885751 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-09T09:00:59.056Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-09T09:00:59.720Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-04-09T09:01:00.263Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-04-09T09:01:00.271Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-09T09:01:00.278Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-09T09:01:00.280Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-04-09T09:01:00.282Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-04-09T09:01:00.290Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-09T09:01:00.303Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-09T09:01:00.306Z: JOB_MESSAGE_DETAILED: Unzipping flatten s14 for input s12.out
root: INFO: 2019-04-09T09:01:00.309Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-04-09T09:01:00.311Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Decode Values
root: INFO: 2019-04-09T09:01:00.313Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Decode Values
root: INFO: 2019-04-09T09:01:00.316Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
root: INFO: 2019-04-09T09:01:00.318Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/StreamingPCollectionViewWriter into compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Values
root: INFO: 2019-04-09T09:01:00.320Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Decode Values
root: INFO: 2019-04-09T09:01:00.322Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-04-09T09:01:00.324Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Decode Values into assert_that/Create/Impulse
root: INFO: 2019-04-09T09:01:00.326Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-09T09:01:00.328Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
root: INFO: 2019-04-09T09:01:00.331Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
root: INFO: 2019-04-09T09:01:00.333Z: JOB_MESSAGE_DETAILED: Fusing consumer start/Decode Values into start/Impulse
root: INFO: 2019-04-09T09:01:00.335Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/PairWithVoidKey into compute/MapToVoidKey0
root: INFO: 2019-04-09T09:01:00.337Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-04-09T09:01:00.339Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Values into compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKey/MergeBuckets
root: INFO: 2019-04-09T09:01:00.341Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-04-09T09:01:00.343Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Decode Values
root: INFO: 2019-04-09T09:01:00.345Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKey/WriteStream into compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/PairWithVoidKey
root: INFO: 2019-04-09T09:01:00.347Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKey/MergeBuckets into compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKey/ReadStream
root: INFO: 2019-04-09T09:01:00.350Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2019-04-09T09:01:00.352Z: JOB_MESSAGE_DETAILED: Fusing consumer side/Decode Values into side/Impulse
root: INFO: 2019-04-09T09:01:00.364Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-09T09:01:00.407Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-09T09:01:00.458Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-09T09:01:00.613Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-04-09T09:01:00.626Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-09T09:01:00.631Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-04-09T09:01:02.950Z: JOB_MESSAGE_BASIC: Executing operation side/Impulse+side/Decode Values+compute/MapToVoidKey0+compute/MapToVoidKey0+compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/PairWithVoidKey+compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKey/WriteStream
root: INFO: 2019-04-09T09:01:02.950Z: JOB_MESSAGE_BASIC: Executing operation start/Impulse+start/Decode Values+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/WriteStream
root: INFO: 2019-04-09T09:01:02.950Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/ReadStream+assert_that/Group/GroupByKey/MergeBuckets+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
root: INFO: 2019-04-09T09:01:02.950Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKey/ReadStream+compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKey/MergeBuckets+compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Values+compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)/StreamingPCollectionViewWriter
root: INFO: 2019-04-09T09:01:02.962Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Impulse+assert_that/Create/Decode Values+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/WriteStream
root: INFO: 2019-04-09T09:01:37.713Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-09T09:02:01.649Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 14 tests in 1629.557s

FAILED (failures=2)
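The compute/MapToVoidKey0 and compute/_DataflowIterableSideInput(...) stages in the captured log above appear to be runner-inserted stages that materialize a side input for streaming execution; the user-level pipeline behind them is roughly of the following shape (a minimal sketch with illustrative labels and values, not the actual test body):

    # Sketch of the user-level side-input pattern behind the compute/... steps
    # above; the MapToVoidKey0 and _DataflowIterableSideInput stages are added
    # by the Dataflow runner, not written in the test.
    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    with TestPipeline() as p:
      start = p | 'start' >> beam.Create([1, 2, 3])
      side = p | 'side' >> beam.Create([10])
      result = start | 'compute' >> beam.Map(
          lambda x, s: x + sum(s), beam.pvalue.AsList(side))
      assert_that(result, equal_to([11, 12, 13]))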
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_53_01-10341468619025132831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_00_32-726084166424521269?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_53_03-17915902005007763959?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_00_49-10203848897187572696?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_53_00-15210977526279946255?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_00_56-13819156804711885751?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_53_00-9562184461170626544?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_53_00-3553646674675589274?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_00_06-12530423445304527893?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_52_59-15733927532905282660?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_53_00-8307319880013643258?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_00_31-15819480925978615318?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_01_52_59-17049891697713304623?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_02_00_25-4569287754656911178?project=apache-beam-testing.

> Task :beam-sdks-python:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build.gradle>' line: 259

* What went wrong:
Execution failed for task ':beam-sdks-python:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 45m 38s
61 actionable tasks: 44 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/u4xzhqem2bv7c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
